All posts in “Technology”

Regulus Cyber launches with a technology to secure autonomous vehicles

Update: This post has been updated to reflect that Regulus’ chief executive is Yonatan Zur. 

Over the next 20 years the autonomous vehicle market is expected to grow into a $700 billion industry as robots take over nearly every aspect of mobility.

One of the key arguments for this shift away from manually operated machines is that they offer greater safety thanks to less risk of human error. But as these autonomous vehicles proliferate, there needs to be a way to ensure that these systems aren’t exposed to the same kinds of hacking threats that have bedeviled the tech industry since its creation.

It’s the rationale behind Regulus Cyber, a new Israeli security technology developer founded by Yonatan Zur and Yoav Zangvil — two longtime professionals from Israel’s aerospace and defense industry.

“We’re building a system that is looking at different sensors and the first system is GPS,” Zur says. Using a proprietary array of off-the-shelf antennas and software developed internally, the system Regulus has designed can determine whether a GPS signal is legitimate or has been spoofed by a hacker (think of it as a way to defend against the kind of hack used by the bad guys in “Die Hard 2”).

Zur first had the idea to launch the company three years ago while he was working with drones at the Israeli technology firm Elbit. At the time, militaries were beginning to develop technologies to combat drone operations and Zur figured it was only a matter of time before those technologies made their way into the commercial drone market as well.

While the technology works for unmanned aerial vehicles, it also has applications for pretty much any type of autonomous transportation technology.

Backing the company are a clutch of well-known Israeli and American investors, including Sierra Ventures, Canaan Partners Israel, Technion and F2 Capital.

Regulus, which raised $6.3 million in financing before emerging from stealth, said the money will be used to expand its sales and marketing efforts and to continue to develop its technology.

The company’s first two products are a spoofing protection module that integrates with any autonomous vehicle and a communication security manager that protects against hacking and misdirection.

“We are very excited to lead this round of financing. Sensors security for autonomous machines will become as important as processors security. Regulus identified the key vulnerabilities and developed the best-in-class solutions,” said Ben Yu, a managing director of Sierra Ventures, in a statement. “Having been working with the company since seed funding, Sierra invested with strong confidence in the team to build Regulus into the category leader.”

Leaked iPhone pics show glass back and headphone jack

The headphone jack could still have a future in an iPhone. These leaked pics show an iPhone SE 2 with a glass back and headphone jack. Like the current iPhone SE, the design seems to be a take on the classic iPhone 5. I dig it.

The leak also states the upcoming device sports wireless charging, which puts it in line with the iPhone 8 and iPhone X.

Rumors have long stated that Apple was working on an updated iPhone SE. The original was released in March 2016 and updated a year later with improved specs. With a 4-inch screen, the iPhone SE is the smallest iPhone Apple offers and also the cheapest.

WWDC in early June is the next major Apple event and could play host to the launch of this phone. Last month, around the iPhone SE’s birthday, Apple held a special event in a Chicago school to launch an education-focused iPad. It’s logical that Apple pushed the launch of this new iPhone SE to WWDC to give the iPad event breathing room.

While Apple cut the headphone jack from its flagship devices, the SE looks to retain the connection. It makes sense. The low-cost iPhone is key for Apple in growing markets around the world, where the last two models helped expand iOS’s market penetration. As the company’s cheapest offering, the SE suggests Apple doesn’t expect its buyers to also spring for wireless earbuds.

If released at WWDC or later in the year, the iPhone SE looks to serve consumers who enjoy smaller phones with headphone jacks. That’s me.

Meet the quantum blockchain that works like a time machine

A new – and theoretical – system for blockchain-based data storage could ensure that hackers will not be able to crack cryptocurrencies once we all enter the quantum era. The idea, proposed by researchers at the Victoria University of Wellington in New Zealand, would secure our crypto futures for decades to come using a blockchain technology that is like a time machine.

You can check out their findings here.

To understand what’s going on here we have to define some terms. A blockchain stores every transaction in a system on what amounts to an immutable record of events. The work necessary for maintaining and confirming this immutable record is what is commonly known as mining. But this technology – which the paper’s co-author Del Rajan claims will make up “10 percent of global GDP… by 2027” – will become insecure in an era of quantum computers.
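To make the classical side of that definition concrete, here is a minimal, illustrative Python sketch of how a hash-linked chain with a toy proof-of-work makes a record tamper-evident. It is not from the paper and does not reflect any real cryptocurrency’s format; the block fields and the difficulty value are assumptions chosen for the example.

import hashlib
import json
import time

def hash_block(block):
    # Deterministically hash the block's contents with SHA-256.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine_block(transactions, prev_hash, difficulty=3):
    # "Mining" here is a toy proof-of-work: find a nonce so the block's
    # hash starts with `difficulty` zeros.
    nonce = 0
    while True:
        block = {
            "timestamp": time.time(),
            "transactions": transactions,
            "prev_hash": prev_hash,  # the link that chains blocks together
            "nonce": nonce,
        }
        digest = hash_block(block)
        if digest.startswith("0" * difficulty):
            return block, digest
        nonce += 1

# Build a two-block chain. Tampering with the first block would change its
# hash and break the prev_hash link stored in the second block, which is
# what makes the record effectively immutable.
genesis, genesis_hash = mine_block(["alice pays bob 5"], prev_hash="0" * 64)
block_two, block_two_hash = mine_block(["bob pays carol 2"], prev_hash=genesis_hash)
print(block_two["prev_hash"] == genesis_hash)  # True while the chain is intact

The worry the paper addresses is that quantum computers could undermine the cryptography a chain like this relies on, which is what motivates the quantum alternative described below.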

Therefore, the solution to storing a blockchain in a quantum era is a quantum blockchain that uses a series of entangled photons. Further, Spectrum writes: “Essentially, current records in a quantum blockchain are not merely linked to a record of the past, but rather a record in the past, one that does not exist anymore.”

Yeah, it’s weird.

From the paper intro:

Our method involves encoding the blockchain into a temporal GHZ (Greenberger–Horne–Zeilinger) state of photons that do not simultaneously coexist. It is shown that the entanglement in time, as opposed to an entanglement in space, provides the crucial quantum advantage. All the subcomponents of this system have already been shown to be experimentally realized. Perhaps more shockingly, our encoding procedure can be interpreted as non-classically influencing the past; hence this decentralized quantum blockchain can be viewed as a quantum networked time machine.

In short, the quantum blockchain is immutable because the photons that it contains do not exist in the current time but are still extant and readable. This means you can see the entire blockchain but you cannot “touch” it, and the only entry you would be able to try to tamper with is the most recent one. In fact, the researchers write, “In this spatial entanglement case, if an attacker tries to tamper with any photon, the full blockchain would be invalidated immediately.”

Is this really possible? The researchers note that the technology already exists.

“Our novel methodology encodes a blockchain into these temporally entangled states, which can then be integrated into a quantum network for further useful operations. We will also show that entanglement in time, as opposed to entanglement in space, plays the pivotal role for the quantum benefit over a classical blockchain,” the authors write. “As discussed below, all the subsystems of this design have already been shown to be experimentally realized. Furthermore, if such a quantum blockchain were to be constructed, we will show that it could be viewed as a quantum networked time machine.”

Don’t worry about having to update your Bitcoin wallet, though. This process is still very theoretical and not at all available to mere mortals. That said, it’s nice to know someone is looking out for our quantum future, however weird it may be.

Bring on museum companion apps — but only if they’re absolutely awesome

On a weekday afternoon in the Greek and Roman sculpture hall at The Metropolitan Museum of Art, I was, to my surprise, the only visitor on my phone. 

The long gallery was fairly empty, but it was oddly devoid of the usual selfie-takers and Instagram story posters. I, on the other hand, had my phone out because I was attempting to use an app called Smartify to scan “Marble statue of a kouros (youth),” in order to learn more about the sculpture. It wasn’t going well. 

Smartify is one of a handful of apps that allow museum-goers to use computer vision to scan works of art, bringing up supporting media and information. The Google Arts & Culture app also purports to have a version of this technology (although Google couldn’t tell me which museums offer the functionality). Other apps from organizations like the Guggenheim use Bluetooth to serve museum visitors information about artwork. And recently, the global design firm R/GA unveiled the “BG Selects” app for its guest exhibition at Cooper Hewitt, the Smithsonian Design Museum in New York City. The R/GA app not only scans art, but also provides a synced audio experience for video, based on sub-audible sound wave recognition.

These apps are certainly full of potential. A seamless and functional museum companion app could enrich the museum-going experience through transportive AR experiences, un-Google-able supplemental information, or other multimedia. 

Unfortunately, the apps can be clunky, unimaginative, and redundant. They introduce reliance on our phones in one of the last places where we don’t turn to our screens for entertainment. The technology could hinder our ability to appreciate art rather than enhance it.

When it comes to introducing cellphone-based technology into museums, the experiences will live or die on the app-maker’s ability to execute a flawless user experience — and on their imaginations.

I just wanted to LEARN.

Image: screenshot: Mashable/Rachel Kraus

The Bob Greenberg Selects exhibition at Cooper Hewitt was all white walls, impeccable lighting, and well-spaced technological artifacts. But the most curious thing about it was that the art came with no explanatory placards.

BG Selects is the latest installment of Cooper Hewitt’s “Selects” series, for which notable figures in design create a temporary exhibition around their expertise using the museum’s collection, and sometimes their own. 

Bob Greenberg is the beret-wearing co-founder of advertising, consulting, design, and innovation firm R/GA. He is also an avid collector of art and design. Greenberg guest-curated the exhibit to explore “how design and technology have augmented and revolutionized modern human life,” according to R/GA, featuring objects like early fax machines alongside iPhones to draw connections between innovations over time, or what he called a technological “lineage.”


A hallmark of the exhibit was its companion app (available for iOS and Android). When visitors entered the exhibit, they were prompted to download the “BG Selects” app and provided with a free set of headphones. The app came with two capabilities. First, it allowed users to scan the object with their phones. Once the app recognized the art, it would serve companion videos from Greenberg or other design experts explaining the significance of the objects, in an often personal manner.

It also allowed users to hear the audio of videos playing in the exhibit, synced up to the video’s timecode. In most exhibits, videos require subtitles because sound is a no-no. At BG Selects, the videos emit sub-audible soundwaves that the app picks up via phone or headphone microphones. Users are prompted to turn on the sound to hear the audio synced up to where the video is actually playing.

Both capabilities in the app come to R/GA through its start-up incubator program. The visual component comes through Clarifai’s proprietary image recognition software, and the audio comes through LISNR, “a communication protocol that uses inaudible sound to broadcast information.”

“There are a lot of opportunities this technology has to really affect our lives, outside of museums, inside of museums,” said Michael Piccuirro, R/GA’s senior technology director. “This is just a very early sort of manifestation of that.”

Who needs museum placards anyway?!

Image: courtesy of R/GA

I was shocked by how well the app functioned. Scanning was quick and easy. Once I selected the audio, I didn’t have to look at my phone to get the information. And I was always able to connect to the video audio when I wanted to — whereas connecting on command is a problem I frequently experience with Bluetooth. In both cases, the phone was more of a portal than the experience itself.

The limitation of the app was not the technology, it was the content. The audio could have been more than the musings of people whose names a layperson doesn’t recognize. Still, listening to the guides nerding out about design history was informative and a nice personal touch.

Other attendees were not as sold on the experience. Some had difficulty getting it to work, others disliked the audio guide format, and some resented the fact that they had to download an app in the first place. One attendee told me he didn’t come to a museum just so he could download another app.

But Cooper Hewitt’s curator, Emily Orr, told me that she didn’t want the app to feel compulsory. Instead, she saw it as enriching and an aesthetic extension of the exhibit itself.

“You can choose how much you want to engage with it,” Orr said. “It’s personal, or can be made personal… We wanted to focus on the objects and keep the exhibition design clean. Using an application like this allows us to do that.”

“We’re showing the future of the museum,” Greenberg said.

One of four technology “lineages.”

Image: courtesy of R/GA

Smartify is known as “the Shazam of the art world.” It’s not tied to any single museum, but rather works in several museums around the world, including the Metropolitan Museum of Art.

After it failed to scan any 3D objects in classical sculpture during my visit to the Met, I thought I’d try my luck with some good, old-fashioned paintings. I headed to Medieval Painting, because I’ve never been into the whole haloed baby motif. I wanted to test the idea that learning more about a piece of art could change my experience with it.

Smartify worked well when scanning paintings. Two options became available when it identified “Capriccio with St. Paul’s and Old London Bridge”: information from The Met and biographical information.

When I clicked on The Met’s information, the text on my smart phone screen was exactly the same as on the painting’s accompanying placard! It didn’t provide anything new. My incredulity increased when I discovered that the biographical information was just Wikipedia text, which I could have just as easily Googled without going to the trouble of downloading an app. I felt silly and even disrespectful reading from my phone while standing in front of a painting someone had decided was important enough to be hung on the wall of The Met.

Smartify info

Image: screenshot: Mashable/Rachel Kraus

Met placard

Image: Rachel Kraus

While wandering through Jesus and Mary Central, disappointed with Smartify, I pushed my way through some glass double doors, and was met with a towering display of horns from around the world. I’d stumbled into The Met’s collection of musical instruments. I dropped my phone back into my bag and wandered around the balcony filled with Victorian pianos, tiny violins, and something that I thought was a traditional Jewish grogger (a noisemaker used while celebrating the holiday Purim) but turned out to be an Italian percussion instrument.

I was reminded of the thing that makes me stop in my tracks at a museum — the reason I love going to them. It doesn’t come from anything supplementary, let alone a smart phone. It’s my individual alchemy, whatever it is about me that makes me delight in the statue of a crone and ignore one of a pharaoh. If I’d been looking at my phone in the medieval hall, I might have missed that experience of learning and discovery … in pursuit of learning and discovery.

But the instruments also reminded me that my phone could have been a powerful tool to ramp up that feeling. If I’d been able to use, say, Clarifai or LISNR technology in the music hall at The Met, maybe I could have heard what an ancient Egyptian horn actually sounded like, or the rich resonance of a wood and ivory piano. 

The horns were definitely not scannable.

Image: screenshot: Mashable/Rachel Kraus

Visiting these exhibits, I was less perturbed by the integration of smart phones into museums than I thought I would be. Having my phone in my hand (admittedly in Do Not Disturb mode) didn’t take away from my ability to turn a corner and be blown away by an unexpected expression hidden in a portrait. 

What worried me was when the apps fell short, both in usability and in value.

“Certainly at the Met, we don’t want people thinking about technology,” Loic Tallon, the digital director of the Met, told me. “We want people thinking about the art. If that piece of content’s going to help them experience the art, and have a better understanding of it, then I want to get it to them in the most seamless way possible.”

That seamlessness is key. If an app doesn’t scan, freezes, or dies, it becomes an impediment to the museum experience. Something else on our phones aggravating us is the last thing we need in a museum.

If an app does work well, the content has really got to deliver — beyond what a wall label or even an audio guide can do.

“For me, someone looking at their phone screens to get the information vs. the label, there isn’t a huge amount of difference,” Tallon said. “For me, the greatest challenge is actually the more global societal challenge which we currently face: that people’s attention is being dramatically reduced by these technologies. How we use phones is dramatically changing how our brains work. And that’s where I see the impact of these technologies, on people’s ability to stand in front of an artwork and spend more time with it.”

Museum companion apps could actually combat that restlessness by engaging us in a way that static art, fixed in linear time, can’t. I recently learned about an Augmented Reality exhibit of the biggest museum art heist in history. But to my disappointment, the AR component was just showing the art back in the frames where it once was. I could have Googled that!

What if the AR took us inside the heist, showed us how the burglars pulled off the crime, transported us to the struggle to rip a piece of art out of its frame without damaging it?

What if, at the Met, an AR app could show the process of creating the painting in front of you? Or at Cooper Hewitt, if the audio guide was narrated by children seeing the retro objects for the first time? Or featured original news reports about the objects when they were created? The only thing keeping these supplemental apps from being truly valuable is the imaginative limitations of their creators. 

Function is good, but not everything.

Image: courtesy of R/GA

“No matter if a technology changes, the questions you’re asking about the impact on the experience have been long-standing,” Tallon said. “How technology impacts the experience of art — that larger question has been something that institutions have looked at for over 60 years.”

It’s important that we keep asking these questions before introducing technology for the sake of technology into these blessedly screen-free spaces. But we can’t let skepticism or convention limit us, either, because the potential for technology to broaden our minds and deepen our emotional connection to art is too great.

So, museums, bring on the apps. But you’d better make sure they’re freaking awesome.
