Keynote

I tuned into the Apple keynote today. I mostly just listened while getting real work done, but one segment caught my attention enough that I ended up watching all twenty or so minutes of it. It wouldn't be hard to guess which one: the section on augmented reality. Apple is doing a lot of incredible things with AR technology, and I want in.

ARKit 2 has expanded its range, allowing for more advanced and accurate tracking of real-life surroundings. This includes live measuring of objects just by tapping from one point to another in Apple's new Measure app. It can also register geometric shapes (the demo used a rectangle, but I imagine other polygons are possible) and give relatively accurate measurements of their dimensions as well as their area. The other neat demonstration was a fully interactive AR game built just from registering one LEGO building on a table. The iPad was able to detect it and then present an entire LEGO world directly on the table for the users to interact with. Note the plural. Now we can have multi-user interaction, live and augmented onto reality, all from one scan of the surroundings.
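Out of curiosity, I sketched what that multi-user piece might look like in code. This isn't from the keynote, just my rough reading of the ARKit 2 APIs: one device captures its world map and sends it to a peer, and the peer relocalizes into the same coordinate space. SharedARCoordinator and the send/receive closures are my own placeholders, and the networking layer isn't shown.

```swift
import ARKit

// Rough sketch of ARKit 2's shared-experience flow, as I understand it: the "host"
// device serializes its ARWorldMap and sends it to a peer; the peer restarts its
// session from that map so both users share one coordinate space.
// SharedARCoordinator and the closures are placeholders; networking is not shown.

final class SharedARCoordinator {
    let sceneView = ARSCNView()

    // Host: capture the current world map and hand it off as Data.
    func shareWorldMap(send: @escaping (Data) -> Void) {
        sceneView.session.getCurrentWorldMap { worldMap, error in
            guard let map = worldMap else {
                print("World map not ready yet: \(String(describing: error))")
                return
            }
            if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                            requiringSecureCoding: true) {
                send(data)
            }
        }
    }

    // Guest: rebuild the received map and rerun the session with it so tracking
    // relocalizes into the host's space; anchors added afterwards line up for both users.
    func receiveWorldMap(_ data: Data) {
        guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                from: data) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        configuration.initialWorldMap = map
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }
}
```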

You can see, then, where I want to go with AR/TIFACT relative to these features. What if it weren't just a LEGO building that could get scanned, but a real building? In the AEC industry, aside from having a fun-filled interactive AR video game with real-life buildings, I would like to use this feature to overlay building data: not just your typical building stats and numbers, but 3D details and material samples. Is that useful? I think not quite yet. But this is the vein I want to keep thinking along until it does become something useful. Some great testing grounds for this? Doors Open Toronto? The Venice Biennale?

The next step in my research is to get hands-on with ARKit 2 and see where I can take it. I'm excited.

Also, Test 003 is coming, I promise.

Pilot

Three years ago, I worked on an independent study with a classmate on the Architecture of the Virtual. We explored VR technology and developed a playable 'level' that altered gravity to manipulate a user's perception of architecture. The architecture of the virtual is expanding more than ever.

Two years ago, I collaborated with another classmate on an urban strategy and development proposal for a strip of land adjacent to the Tonghui River in Beijing, China. The proposal stemmed from an early exploration of a community-driven database of building information delivered through augmented reality (see our first unsettlingly ugly collage proposal). The idea was later shelved, as both of us lacked the time, resources and knowledge to take it to the next level.

One year ago, I worked on myself: my work experience and my connections within the industry. I wrote my licensing exam and continued to hone the tools of my trade. I entered some small competitions with no results, and I started some minor drawing and writing projects for fun.

Today I've decided to take all of those experiences and ideas off the shelf. I'm excited to take a stab at bringing an idea I cooked up those years ago to fruition. Whether I'll have a viable and usable product in the end is completely unknown, but what I can guarantee is that I'm going to do a lot of learning and a lot of failing to get somewhere farther than where I am right now.

ARTIFACT Laboratory is my new project.
I’m starting small but trying to think big.

My vision is to use the technology currently available in AR development, in combination with GPS tracking, to bring a new layer of community-driven social interaction to the city. We've seen it in action already through the infamous Pokémon Go, where your location determines which Pokémon you can catch. There are also several notable AR projects in the AEC industry for displaying and visualizing design projects on real sites, such as Darf Design's ARKi. My interests lie somewhere in between.
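To make that less abstract to myself, here's a tiny sketch of the GPS half of the idea. Everything in it is hypothetical (the Site type, the 50 m radius, the sample coordinates); the point is just that CoreLocation can act as the gate that decides which AR content a user unlocks where they're standing, the way Pokémon Go gates spawns by location.

```swift
import CoreLocation

// Hypothetical sketch: a location "gate" that decides which site-specific AR content
// to unlock, Pokémon Go-style. The Site type, the 50 m radius, and the sample
// coordinates are placeholders made up for illustration.

struct Site {
    let name: String
    let location: CLLocation
}

final class SiteGate: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let sites = [
        Site(name: "Sample Building", location: CLLocation(latitude: 43.6532, longitude: -79.3832))
    ]

    // Called once the user is close enough to a known site; the AR layer would load
    // that site's anchors and overlays from here.
    var onNearbySite: ((Site) -> Void)?

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        if let nearby = sites.first(where: { $0.location.distance(from: current) < 50 }) {
            onNearbySite?(nearby)
        }
    }
}
```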

Stay tuned
