All posts in “Augmented Reality”

Kayak’s new AR feature will tell you if your carry-on bag fits the overhead bin

Popular travel app Kayak has put augmented reality to clever use with a new feature that lets you measure your carry-on bag using just your smartphone. Its updated iOS app takes advantage of Apple’s ARKit technology to introduce a Bag Measurement tool that calculates your bag’s dimensions so you can find out whether it fits in the overhead bin – you know, before your trip.

The tool is handy because the dimensions of permitted carry-on luggage differ from airline to airline, Kayak explains, so it’s not always simple to figure out whether your bag will fit.

In the new Kayak iOS app, you can access the measurement tool through the Flight Search feature.

The app will first prompt you to scan the floor in order to calibrate the measurements. You then move your phone around the bag to capture its size. Kayak’s app does the math and returns the bag’s length, width, and height.

And it will tell you whether the bag “looks good” to meet carry-on size requirements.

[embedded content]
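Kayak hasn’t published its implementation, but the flow it describes maps onto ARKit’s standard measuring pattern: detect the floor plane to establish real-world scale, then hit-test screen points against the scene and compute distances between them. A minimal sketch of that pattern (all names here are illustrative, not Kayak’s code):

```swift
import UIKit
import ARKit

// Minimal ARKit measuring sketch (illustrative only — not Kayak's code).
final class BagMeasureViewController: UIViewController {
    let sceneView = ARSCNView()
    var capturedPoints: [simd_float3] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap)))

        // Step 1: scanning the floor gives ARKit a horizontal plane
        // to anchor real-world scale against.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        // Step 2: hit-test the tapped screen point against detected surfaces.
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point,
            types: [.existingPlaneUsingExtent, .featurePoint]).first else { return }
        let t = hit.worldTransform.columns.3
        capturedPoints.append(simd_float3(t.x, t.y, t.z))

        // Step 3: the distance between two captured points is one edge of
        // the bag; repeat for length, width, and height.
        if capturedPoints.count == 2 {
            let cm = simd_distance(capturedPoints[0], capturedPoints[1]) * 100
            print(String(format: "Edge: %.1f cm", cm))
            capturedPoints.removeAll()
        }
    }
}
```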

Plus, the company says it compares all the airlines’ baggage size requirements in one place, so you’ll know for sure whether your bag will be allowed on the airline you’re flying.
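That comparison step is straightforward once the dimensions are captured: sort both the measured dimensions and each airline’s limits, then compare pairwise so the bag can be oriented any way round. A hypothetical sketch of the idea — the limits below are made-up examples, not actual airline policy:

```swift
// Hypothetical fit-check sketch; limits are examples, not real airline data.
struct CarryOnLimit {
    let airline: String
    let maxDimensionsCM: [Double]  // length, width, height
}

func fits(bag: [Double], limit: CarryOnLimit) -> Bool {
    // Sorting both sides lets the bag be rotated into its best orientation.
    let b = bag.sorted(by: >)
    let l = limit.maxDimensionsCM.sorted(by: >)
    return zip(b, l).allSatisfy { $0.0 <= $0.1 }
}

let measured = [55.0, 39.0, 22.0]  // cm, from the AR scan
let limits = [
    CarryOnLimit(airline: "Example Air", maxDimensionsCM: [56, 45, 25]),
    CarryOnLimit(airline: "Sample Jet", maxDimensionsCM: [55, 40, 20]),
]
for limit in limits {
    let verdict = fits(bag: measured, limit: limit) ? "looks good" : "too big"
    print("\(limit.airline): \(verdict)")
}
```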

Augmented reality applications, so far, have been a mixed bag. (Sorry).

Some applications can be fairly useful – like visualizing furniture in a room or trying on new makeup colors. (Yes, really. I’m serious.) But others are more questionable – like some AR gaming apps, perhaps. (For example, how long would you play that AR slingshot game?)

But one area where AR has held up better is in helping you measure stuff with your phone – so much so that even Apple threw in its own AR measuring tape with iOS 12.

Kayak’s tool, also timed with the release of iOS 12, is among those more practical applications.

The company says the AR feature is currently live only on updated iOS devices.

Blippar picks up $37 million hoping to become profitable in the next year

Blippar, the AR startup that launched in 2011, has today announced the close of a $37 million financing led by Candy Ventures and Qualcomm Ventures.

The company started out by offering AR experiences for brand marketers through publishers and other real-world products, letting users unlock AR content by scanning a tag called a “Blipp”.

Blippar then transitioned to a number of different AR products, but took a particular focus on computer vision, launching a consumer-facing visual search engine that would let users identify cars, plants, and other real-world objects.

Most recently, Blippar has introduced an indoor positioning system that lets commercial real estate owners implement AR mapping and other content from within their buildings.

The AR industry has been in a state of evolution for the past few years, and Blippar has repeatedly shifted and repositioned itself to take advantage of the blossoming market. Unfortunately, several pivots have put the company in a tough spot financially.

BI reports that Blippar posted revenue of £8.5 million ($11.2 million) in the 16-month period up to March 31, 2016, with losses of £24 million ($31.5 million). These latest rounds have essentially let Blippar keep the lights on while it tries to pick up the pace on revenue.

The company says this latest round is meant to fuel its race to reach profitability in the next 12 months. Blippar has raised more than $137 million to date.

eBay’s HeadGaze lets motor-impaired users navigate the site with head movements

The sophisticated head-tracking system built into the iPhone X may have been intended for AR and security purposes, but it may also turn out to be very useful for people with disabilities. A proof-of-concept app from an eBay intern shows how someone with very little motor function can navigate the site with nothing but head movements.

Muratcan Cicek is one such person, and he relies on assistive technology every day to read, work, and get around. This year he was interning at eBay and decided to create a tool that would help people with motor impairments like his shop online. It turns out there are lots of general-purpose tools for accessibility, like letting a user control a cursor with their eyes or a joystick, but nothing made just for navigating a site like eBay or Amazon.

His creation, HeadGaze, relies on the iPhone X’s front-facing sensor array (via ARKit) to track the user’s head movements. Different movements correspond to different actions in a demonstration app that shows the online retailer’s daily deals: navigate through categories and products by tilting your head all the way in various directions, or tilt partway down to buy, save, or share.
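HeadGaze’s actual code is open source (linked further down); as a rough illustration of the approach, ARKit’s face tracking exposes a head-pose transform that can be reduced to tilt angles and mapped to actions. The thresholds below are made-up example values, not the project’s real ones:

```swift
import ARKit

// Rough illustration of head-pose tracking via ARKit face tracking —
// not HeadGaze's actual implementation (that's on GitHub).
final class HeadTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X-class hardware).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let t = face.transform
        // The anchor's +z axis points out of the user's face; its y component
        // indicates how far the head is tilted up or down.
        let forward = simd_float3(t.columns.2.x, t.columns.2.y, t.columns.2.z)
        let up = simd_float3(t.columns.1.x, t.columns.1.y, t.columns.1.z)
        let pitch = asin(forward.y)   // negative when tilted down
        let roll = atan2(up.x, up.y)  // nonzero when tilted sideways

        // Map full tilts to navigation and partial tilts to actions,
        // echoing the demo app's scheme. Thresholds (radians) are examples.
        if pitch < -0.35 { print("full tilt down: next item") }
        else if pitch < -0.15 { print("partial tilt down: buy / save / share") }
        else if abs(roll) > 0.3 { print("sideways tilt: switch category") }
    }
}
```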

You can see it in action in the short video below:

[embedded content]

It’s not that this is some huge revolution in interface design — there are apps and services that do this already, though perhaps not in such a straightforward and extensible way.

But it’s easy to underestimate the cognitive load created when someone has to navigate a UI that’s designed around senses or limbs they don’t have. To create something like this isn’t necessarily simple, but it’s useful and relatively straightforward, and the benefits to a person like Cicek are substantial.

That’s probably why he made the HeadGaze project open source — you can get all the code and documentation at GitHub; it’s all in Swift and currently only works on the iPhone X, but it’s a start.

Considering this was a summer project by an intern, there’s not much of an excuse for companies with thousands of developers not to have something like it available for their apps or storefronts. And it’s not like you couldn’t think of other ways to use it. As Cicek writes:

HeadGaze enables you to scroll and interact on your phone with only subtle head movements. Think of all the ways that this could be brought to life. Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?

He and his colleagues are also looking into actual gaze-tracking to augment the head movements, but that’s still a ways off. Maybe you can help.

Apple takes a step towards its own version of Google Glass

Apple’s AR headset plans take a step forward.

Image: Justin Sullivan/Getty Images

There have long been murmurs about Apple making its own augmented reality glasses, like Google Glass, and its latest acquisition is another step in that direction.

Apple has acquired Denver-based AR lens startup Akonia Holographics, according to a report by Reuters.

Founded in 2012, the startup focused on holographic data storage before shifting to smart glass technologies. 

Akonia’s HoloMirror smart glass uses a single layer of media, and the company boasts “ultra-clear, full-color performance … [enabling] the thinnest, lightest head worn displays in the world.”

In a statement to Reuters, an Apple spokesperson (as per usual) didn’t give much away: “Apple buys smaller companies from time to time, and we generally don’t discuss our purpose or plans.”

An executive in the AR industry told the outlet that Akonia had become “very quiet” in the six months leading to the acquisition, indicating the deal happened earlier this year.

Back in April, CNET reported Apple was working on a combined AR/VR headset, codenamed “T288,” that would support both AR and VR apps and launch by 2020.

Each lens would reportedly feature an 8K display per eye (roughly 7,680 x 4,320, or about 33 million pixels), for a combined horizontal resolution of 16K. That would eclipse current VR rivals like the Oculus Rift and HTC Vive, which offer just 1,080 x 1,200 pixels (about 1.3 million) per eye.

Of course, it’s still early days, and Apple’s headset dreams might just stay that way.


Teardown of Magic Leap One reveals highly advanced placeholder tech

The screwdriver-happy dismantlers at iFixit have torn the Magic Leap One augmented reality headset all to pieces, and the takeaway seems to be that the device is very much a work in progress — but a highly advanced one. Its interesting optical assembly, described as “surprisingly ugly,” is laid bare for all to see.

The head-mounted display and accompanying computing unit are definitely meant for developers, as we know, but the basic methods and construction Magic Leap is pursuing are clear from this initial hardware. It’s unlikely that there will be major changes to how the gadget works except to make it cheaper, lighter, and more reliable.

At the heart of Magic Leap’s tech is its AR display, which layers 3D images over and around the real world. This is accomplished through a stack of waveguides that let light pass along them invisibly, then bounce it out toward your eye at the proper angle to form the image you see.

The “ugly” assembly in question – pic courtesy of iFixit.

The waveguide assembly has six layers: one for each color channel (red, green, and blue), twice over, arranged so that adjusting which layers display the image changes the perceived distance and size of the object being displayed.

There isn’t a lot out there like this, and certainly nothing intended for consumer use, so we can forgive Magic Leap for shipping something a little bit inelegant by iFixit’s standards: “The insides of the lenses are surprisingly ugly, with prominent IR LEDs, a visibly striated waveguide “display” area, and some odd glue application.”

After all, the insides of devices like the iPhone X or Galaxy Note 9 should and do reflect a more mature hardware ecosystem and many iterations of design along the same lines. This is a unique, first-of-its-kind device and as a devkit the focus is squarely on getting the functionality out there. It will almost certainly be refined in numerous ways to avoid future chiding by hardware snobs.

That’s also evident from the eye-tracking setup, which from its position below the eye will likely perform better when you’re looking down or straight ahead rather than upwards. Future versions may include more robust tracking systems.

Another interesting piece is the motion-tracking setup. A little box hanging off the edge of the headset is speculated to be the receiver for the magnetic field-based motion controller. I remember using magnetic-interference motion controllers back in 2010 — no doubt there have been improvements, but this doesn’t seem to be particularly cutting-edge tech. An improved control scheme can probably be expected in future iterations, as this little setup is pretty much independent of the rest of the device’s operation.

Let’s not judge Magic Leap on this interesting public prototype — let us instead judge them on the farcically ostentatious promises and eye-popping funding of the last few years. If they haven’t burned through all that cash, there are years of development left in the creation of a practical and affordable consumer device using these principles and equipment. Many more teardowns to come!