All posts in “Augmented Reality”

Medivis has launched its augmented reality platform for surgical planning

After two years of development, Medivis, a New York-based company developing augmented reality data integration and visualization tools for surgeons, is bringing its first product to market.

The company was founded by Osamah Choudhry and Christopher Morley, who met as senior residents at NYU Medical Center.

What began as a side project grew as the two residents roped in some engineers to help develop their first prototypes, and after a stint in NYU’s Summer Launchpad program they decided to launch the company.

Now, with $2.3 million in financing led by Initialized Capital and partnerships with Dell and Microsoft to supply hardware, the company is launching its first product, called SurgicalAR.

In fact, it was the launch of the HoloLens that really gave Medivis its boost, according to Morley. That technology pointed the way toward what Morley said was one of the dreams for technology in the medical industry.

“The Holy Grail is to be able to holographically render this anatomy,” he said.

For now, Medivis is able to access patient data and represent it visually in a three-dimensional model for doctors to refer to as they plan surgeries. That model is mapped back to the patient to give surgeons a plan for how best to approach an operation.

“The interface between medical imaging and surgical utility from it is really where we see a lot of innovation being possible,” says Morley.

So far, Medivis has worked with the University of Pennsylvania and New York University to bring its prototypes into a surgical setting.

The company is integrating some machine learning capabilities to be able to identify the most relevant information from patients’ medical records and diagnostics as they begin to plan the surgical process.

“What we’ve been working on over this time is developing this really disruptive 3D pipeline,” says Morley. “What we have seen is that there is a distinct lack of 3D pipelines to allow people to directly interface… very quickly try to automate the entire rendering process.”

For now, Medivis is selling a touchscreen monitor, display and a headset. The device plugs into a hospital network and extracts medical imaging from the hospital’s servers for display in about 30 seconds, according to Choudhry.
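Medivis hasn’t described the plumbing publicly, but hospital imaging systems generally speak DICOM, and many modern PACS servers expose a DICOMweb (QIDO-RS) HTTP interface for study search. As a purely illustrative sketch, with a made-up endpoint and patient ID rather than anything Medivis has confirmed, a Swift client might query a hospital server for a patient’s studies like this:

```swift
import Foundation

// Hypothetical DICOMweb (QIDO-RS) endpoint; a real deployment would point at the
// hospital's PACS gateway and add proper authentication.
let base = URL(string: "https://pacs.example-hospital.org/dicom-web")!

// QIDO-RS study search: GET /studies?PatientID=... (the patient ID here is made up).
var components = URLComponents(url: base.appendingPathComponent("studies"),
                               resolvingAgainstBaseURL: false)!
components.queryItems = [URLQueryItem(name: "PatientID", value: "12345")]

var request = URLRequest(url: components.url!)
// DICOMweb returns study metadata as DICOM JSON.
request.setValue("application/dicom+json", forHTTPHeaderField: "Accept")

URLSession.shared.dataTask(with: request) { data, _, error in
    guard let data = data, error == nil else { return }
    if let json = try? JSONSerialization.jsonObject(with: data),
       let studies = json as? [[String: Any]] {
        // Each entry describes one study; the pixel data itself would be fetched next via WADO-RS.
        print("Found \(studies.count) studies for this patient")
    }
}.resume()
```

Actually rendering the retrieved volumes as holograms is the hard part; this only covers the retrieval step Choudhry describes.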

“That’s where we see this immediately being useful in that pre-surgical planning stage,” Choudhry says. “The use in surgical planning and being able to extend this through surgical navigation… streamline the process that requires a large amount of pieces and components and setups so you only need an AR headset to localize pathology and make decisions off of that.”

Already, 15 surgeries have been performed in consultation with the company’s technology.

“When we first met Osamah and Chris, we immediately understood the magnitude of the problem they were out to solve. Medical imaging as it relates to surgical procedures has largely been neglected, leaving patients open to all sorts of complications and general safety issues,” said Eric Woersching, General Partner, Initialized Capital, in a statement. “We took one look at Medivis platform and knew they were poised to transform the operating room. Not only was their hands-free approach to visualization meeting a real need for greater surgical accuracy, but the team has the passion and expertise in the medical field to bring it all to fruition. We couldn’t be more thrilled to welcome Medivis to the Initialized family.”

Asteroid is building a human-machine interaction engine for AR developers

When we interact with computers today, we move the mouse, scroll the trackpad and tap the screen, but there is so much that the machines don’t pick up on: where we’re looking, the subtle gestures we make, what we’re thinking.

Asteroid is looking to get developers comfortable with the idea that future interfaces are going to take in much more biosensory data. The team has built a node-based human-machine interface engine for macOS and iOS that allows developers to build interactions that can be imported into Swift applications.

“What’s interesting about emerging human-machine interface tech is the hope that the user may be able to ‘upload’ as much as they can ‘download’ today,” Asteroid founder Saku Panditharatne wrote in a Medium post.

To bring attention to their development environment, they’ve launched a crowdfunding campaign that gives a decent snapshot of the depth of experiences that can be enabled by today’s commercially available biosensors. Asteroid definitely doesn’t want to be a hardware startup, but the campaign is largely serving as a way to expose developers to what tools could be in their interaction design arsenal.

There are dev kits and then there are dev kits, and this is a dev kit. Developers jumping on board for the total package get a pile of open hardware: gear and cases to build out hacked-together interface solutions. The $450 kit brings capabilities like eye-tracking, brain-computer interface electrodes and some gear to piece together a motion controller. Backers can also just buy the $200 eye-tracking kit alone. It’s all very utility-minded and clearly not designed to make Asteroid those big hardware bucks.

“The long-term goal is to support as much AR hardware as we can, we just made our own kit because I don’t think there is that much good stuff out there outside of labs,” Panditharatne told TechCrunch.

The crazy hardware seems to be a bit of a labor of love for the time being. While a couple of AR/VR devices have eye-tracking baked in, it’s still a generation away from most consumer VR devices, and you’re certainly not going to find much hardware with brain-computer interface systems built in. The startup says its engine will do plenty with just a smartphone camera and a microphone, but the broader sell with the dev kit is that you’re not building for a specific piece of hardware; you’re experimenting on the bet that interfaces are going to grow more closely intertwined with how we process the world as humans.
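As a concrete flavor of what a plain smartphone camera can already provide, and to be clear this is Apple’s ARKit face tracking rather than Asteroid’s engine or API, a TrueDepth-equipped iPhone will hand an app a per-frame gaze estimate and eye poses with only a few lines of Swift:

```swift
import UIKit
import ARKit

final class GazeViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking needs a TrueDepth front camera (iPhone X or later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        for case let face as ARFaceAnchor in frame.anchors {
            // lookAtPoint is a rough gaze target in face-anchor space;
            // leftEyeTransform/rightEyeTransform give each eyeball's pose.
            print("gaze target:", face.lookAtPoint,
                  "left eye position:", face.leftEyeTransform.columns.3)
        }
    }
}
```

An interaction engine like Asteroid’s would presumably sit on top of signals like these, fusing them with voice, gesture or BCI input; the class name above is just for illustration.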

Panditharatne founded the company after stints at Oculus and Andreessen Horowitz where she spent a lot of time focusing on the future of AR and VR. Panditharatne tells us that Asteroid has raised over $2 million in funding already but that they’re not detailing the sources of that cash quite yet.

The company is looking to raise $20k from its Indiegogo campaign, but the platform is the clear sell here: exposing people to its human-machine interaction engine. Asteroid is taking sign-ups for the product’s waiting list on its site.

Apple is getting serious about augmented reality with this new appointment


Apple is ramping up its AR efforts. (Image: Apple)

For most people, AR is Pokémon Go and little else. But for Apple, it’s an important part of the company’s future.

Apple just reassigned Frank Casanova, former lead for iPhone Partner Marketing, to a new role: chief of Product Marketing for AR, Bloomberg reported Tuesday.

Casanova changed his LinkedIn profile bio to reflect the new title. “I’m responsible for all aspects of Product Marketing for Apple’s Augmented Reality initiative,” the bio says. 

Casanova, an Apple veteran of 30 years, was there for the first iPhone launch in 2007. And before that, he served nearly ten years as Senior Director for Mac OS X Graphics, Audio and Video.

The fact that Casanova is now the face behind Apple’s AR marketing efforts is telling. It’s yet another indication that the company has big plans for the technology — and it may include new hardware and software products, as well as services, in the future. 

Apple has been pushing AR as an important feature for years, launching its AR development platform, ARKit, in 2017. The platform got a slew of significant improvements in iOS 12.

A recent report from Bloomberg said Apple plans to launch iPhones with a laser-based 3D camera as soon as 2020. Such a camera would improve the iPhone’s Face ID feature, but would primarily be used for improving AR, allowing for better placement of virtual objects on the phone’s display, the report said. 

Furthermore, Apple may be working on an AR headset, which could also come in 2020. The rumors of this mythical device have seemingly been flying around forever, though, and are yet to materialize in any palpable way. 

In fact, Apple CEO Tim Cook shot down the rumors about the headset back in 2017. But he has also repeatedly pointed to AR as a really important technology, potentially as big as the smartphone itself.

As always, it’s hard to tell what Apple is up to — the company doesn’t talk about unreleased products, and many of its competitors tried their hands at VR/AR hardware, with poor results. But the company is definitely up to something. All the signs are there, and we might just hear a lot more about Apple’s AR efforts as soon as next year. 


Facebook picks up retail computer vision outfit GrokStyle

If you’ve ever seen a lamp or chair that you liked and wished you could just take a picture and find it online, well, GrokStyle let you do that — and now the company has been snatched up by Facebook to augment its own growing computer vision department.

GrokStyle started as a paper — as AI companies often do these days — at 2015’s SIGGRAPH. A National Science Foundation grant got the ball rolling on the actual company, and in 2017 founders Kavita Bala and Sean Bell raised $2 million to grow it.

The basic idea is simple: matching a piece of furniture (or a light fixture, or any of a variety of product types) in an image to visually similar ones in stock at stores. Of course, sometimes the simplest ideas are the most difficult to execute. But Bala and Bell made it work, and it was impressive enough that IKEA, on first sight, demanded it be in the next release of its app. I saw it in action and it’s pretty impressive.
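GrokStyle’s models aren’t public, so the following is not its method, just a sketch of the general shape of the problem: embed each image as a feature vector, then rank catalog items by distance to the query photo. Here Apple’s Vision feature prints stand in for a learned product embedding, and the catalog structure is made up:

```swift
import CoreGraphics
import Vision

// Embed one image as a Vision feature print (a stand-in for a learned
// product embedding; GrokStyle's actual models are not public).
func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

// Rank a (made-up) catalog by visual similarity to a query photo.
// Smaller distance means more visually similar.
func closestMatches(to query: CGImage,
                    in catalog: [(sku: String, image: CGImage)]) throws -> [(sku: String, distance: Float)] {
    guard let queryPrint = try featurePrint(for: query) else { return [] }
    var scored: [(sku: String, distance: Float)] = []
    for item in catalog {
        guard let itemPrint = try featurePrint(for: item.image) else { continue }
        var distance: Float = 0
        try queryPrint.computeDistance(&distance, to: itemPrint)
        scored.append((item.sku, distance))
    }
    return scored.sorted { $0.distance < $1.distance }
}
```

A production system like GrokStyle’s would use embeddings trained on furniture and product imagery and an index that scales to millions of SKUs, rather than a linear scan over a small catalog.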

Facebook’s acquisition of the company (no terms disclosed) makes sense on a couple fronts: First, the company is investing heavily in computer vision and AI, so GrokStyle and its founders are naturally potential targets. Second, Facebook is also trying to invest in its marketplace, and using the camera as an interface for it fits right into the company’s philosophy.

One can imagine how useful it would be to be able to pull up the Facebook camera app, point it at a lamp you like at a hotel, and see who’s selling it or something like it on the site.

Facebook did not answer my questions regarding how GrokStyle’s tech and team would be used, but offered the following statement: “We are excited to welcome GrokStyle to Facebook. Their team and technology will contribute to our AI capabilities.” Well!

There’s an “exciting journey” message on GrokStyle’s webpage, so the old site and service are gone for good. But one assumes the technology will reappear in some form in the future. I’ve asked the founders for comment and will update the post if I hear back.

Warby Parker dips into AR with the launch of virtual try-on

Warby Parker is today introducing virtual try-on to let shoppers select a pair of frames and instantly see how they look.

The tech was built on Apple’s ARKit, and the feature is only available to users on the Warby Parker iOS app on an iPhone X or later.

Warby Parker, which launched in 2010, attempted to implement a virtual try-on feature on its website, but pulled the feature shortly after it debuted. The issue?

With something like glasses, virtual try-on needs to be as close to reality as possible. Virtual frames can’t just be overlaid somewhere ‘close to’ the user’s face; they have to match up with all of the user’s facial curves and the placement of the ears, eyes and nose.
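Warby Parker’s placement algorithm is proprietary, so the snippet below is not its implementation; it only shows the ARKit primitive such a feature builds on for iPhone X-class devices: a face anchor that tracks the head every frame, to which a glasses model (here a hypothetical “frames.scn” asset) can be attached and offset toward the bridge of the nose.

```swift
import UIKit
import ARKit
import SceneKit

final class TryOnViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet private var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        // TrueDepth-based face tracking, hence the iPhone X-or-later requirement.
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called once ARKit has detected the face and created its anchor.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor else { return nil }
        let faceNode = SCNNode()
        // "frames.scn" is a hypothetical 3D asset of a pair of glasses.
        if let frames = SCNScene(named: "frames.scn")?.rootNode.clone() {
            // Nudge the model up and forward in face space, toward the nose bridge.
            frames.position = SCNVector3(0, 0.02, 0.06)
            faceNode.addChildNode(frames)
        }
        return faceNode
    }
}
```

ARSCNView keeps the returned node aligned with the tracked face automatically; the true-to-life fit and photorealistic frame rendering the company describes happen on top of primitives like this.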

“It was really our first time building out a full AR feature as a company, and there were two things that were really important,” said Sr. Director of E-Commerce and Consumer Insights Erin Collins. “The first was getting fit right, which was a technical challenge that required a bunch of revisions. And the second thing was making sure the frame images looked as photorealistic as possible, which meant getting 3D artists to digitally render them and lots of revisions to get it pixel perfect on each pair of frames.”

The technology Warby Parker built uses a proprietary algorithm to perfectly place virtual frames on the user’s face. The feature also allows users to quickly snap a screenshot and share with others to get feedback on the frames.

Since inception, Warby Parker has developed its ecommerce brand on the back of a relatively low-tech feature: in-home try-on. The company simply sent users five frames of their choice to try on at home and send back later, once they’d made their purchasing decision.

Collins sees the new virtual try-on as a great complement to that program, while offering a quick and convenient experience for repeat buyers.

“This will make it easier for returning customers to buy glasses without trying them on, but we’re really excited about it as a tool for people to narrow down their home try-on choices,” said Collins.

Warby Parker has raised a total of nearly $300 million in funding from investors such as T.Rowe Price, Tiger Global Management and General Catalyst.