I tuned into the Apple keynote today. I mostly just listened while getting real work done, but one segment caught my attention enough that I ended up watching all twenty minutes of it. It wouldn’t be hard to guess: it was the section on Augmented Reality. Apple is doing a lot of incredible things with AR technology, and I want in.
ARKit 2 has expanded its scope, allowing for more advanced and accurate tracking of real-world surroundings. This includes live measuring of objects just by tapping from one point to another in Apple’s new app, Measure. It can also recognize geometric shapes (the demonstration used a rectangle, but I imagine other polygons are possible) and give a relatively accurate measurement of their dimensions as well as area. The other neat demonstration was a fully interactive AR game built from a scan of a single LEGO building on a table. The iPad was able to detect it and then present an entire LEGO world directly on the table for the users to interact with. Note the plural. Now we can have multi-user interaction, live and augmented onto reality, all from one scan of the surroundings.
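At its core, the point-to-point measuring in Measure comes down to something simple: ARKit’s hit tests give you two positions in world space, and the measurement is just the straight-line distance between them. Here’s a minimal sketch of that math in Swift; the `WorldPoint` struct is a hypothetical stand-in for ARKit’s own vector types, since the real framework only runs on an iOS device.

```swift
// Hypothetical stand-in for an ARKit world-space position (ARKit itself
// uses simd vectors and only runs on Apple hardware).
struct WorldPoint {
    var x: Float
    var y: Float
    var z: Float
}

// The measurement behind a point-to-point AR ruler: Euclidean distance
// between two tapped positions, in metres.
func measure(from a: WorldPoint, to b: WorldPoint) -> Float {
    let dx = b.x - a.x
    let dy = b.y - a.y
    let dz = b.z - a.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// Two taps roughly a metre apart along one axis:
let start = WorldPoint(x: 0, y: 0, z: 0)
let end = WorldPoint(x: 1, y: 0, z: 0)
print(measure(from: start, to: end))
```

The hard part, of course, is what ARKit does for you: turning a tap on the screen into a stable position on a real surface. Once you have those two points, the rest is high-school geometry.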
You can see, then, where I want to go with AR/TIFACT given these features. What if it weren’t just a LEGO building that could be scanned, but a real building? In the AEC industry, aside from making a fun-filled AR video game out of real-life buildings, I would like to use this feature to overlay building data – not just your typical building stats and numbers, but 3D details and material samples. Is that useful? Not quite yet, I think. But this is the vein I want to keep thinking along until it does become something useful. Some great testing grounds for this? Doors Open Toronto? The Venice Biennale?
The next step in my research is to get hands-on with ARKit 2 and see where I can take it. I’m excited.
Also, Test 003 is coming, I promise.