All posts in “Hardware”

TCL leaks foretell a weird future for foldable phones

Foldables are going to get weird. And I’m here for it. Just check out these leaked TCL renders from CNET. All manner of strange and wonderful folding devices — two tablets and three smartphones, including one that flips all the way around into a Futurama-style bracelet. There are renders for tablets and phones that fold both in and out.

Granted, few if any will actually come to fruition, but if this first wave of foldables opens up smartphone design in new and interesting ways like these, the industry will be all the better for it. Of course, we’re still in the early stages of all of this, and the first wave of foldables has yet to prove itself of interest to smartphone buyers as anything beyond a novelty.

We’ll be seeing a fair bit more of the space next week at Mobile World Congress, along with Wednesday’s Samsung event, which is expected to give us another peek at the upcoming Galaxy foldable. For now, however, the Royole FlexPai is the only device that’s actually come to market, and that one still feels like little more than a developer product.

And while TCL’s not a household name in the space, the Chinese company certainly has experience in the display department, both through its TV business of the same name and through smartphone brands like Alcatel, Palm and BlackBerry.

These sorts of renders are probably pretty standard for all companies currently experimenting with a flexible form factor. If there’s one thing all of the announced devices have proven, it’s that the industry is still a ways away from settling on a consistent design language for these devices. And it’s certainly possible that the industry will never settle on a consistent form factor.

Apple acquires talking Barbie voicetech startup PullString

Apple has just bought up the talent it needs to make talking toys a part of Siri, HomePod, and its voice strategy. Apple has acquired PullString, also known as ToyTalk, according to Axios’ Dan Primack and Ina Fried. The company makes voice experience design tools, artificial intelligence to power those experiences, and toys like talking Barbie and Thomas The Tank Engine toys in partnership with Mattel. Founded in 2011 by former Pixar executives, PullString went on to raise $44 million.

Apple’s Siri is seen as lagging far behind Amazon Alexa and Google Assistant, not only in voice recognition and utility, but also in terms of developer ecosystem. Google and Amazon have built platforms to distribute Skills from tons of voice app makers, including storytelling, quizzes, and other games for kids. If Apple wants to take a real shot at becoming the center of your connected living room with Siri and HomePod, it will need to play nice with the children who spend their time there. Buying PullString could jumpstart Apple’s in-house catalog of speech-activated toys for kids as well as beef up its tools for voice developers.

PullString did catch some flak for making a “child surveillance device” back in 2015, but countered by detailing the security built into its Hello Barbie product and saying it had never been hacked to steal children’s voice recordings or other sensitive info. Privacy norms have changed since then, with so many people readily buying always-listening Echos and Google Homes.

We’ve reached out to Apple and PullString for more details about whether PullString and ToyTalk’s products will remain available.

The startup raised its cash from investors including Khosla Ventures, CRV, Greylock, First Round, and True Ventures; its last raise was a 2016 Series D that PitchBook says valued the startup at $160 million. While the voicetech space has since exploded, it can still be difficult for voice experience developers to earn money without accompanying physical products, and many enterprises still aren’t sure what to build with tools like those offered by PullString. That might have led the startup to see a brighter future with Apple, strengthening one of the most ubiquitous though also most detested voice assistants.

Deploy the space harpoon

Watch out, starwhales. There’s a new weapon for the interstellar dwellers whom you threaten with your planet-crushing gigaflippers, undergoing testing as we speak. This small-scale version may only be good for removing dangerous orbital debris, but in time it will pierce your hypercarbon hides and irredeemable sun-hearts.

Literally a space harpoon. (Credit: Airbus)

However, it would be irresponsible of me to speculate beyond what is possible today with the technology, so let a summary of the harpoon’s present capabilities suffice.

The space harpoon is part of the RemoveDEBRIS project, a multi-organization European effort to create and test methods of reducing space debris. There are thousands of little pieces of who knows what clogging up our orbital neighborhood, ranging in size from microscopic to potentially catastrophic.

There are as many ways to take down these rogue items as there are sizes and shapes of space junk; perhaps it’s enough to use a laser to nudge a small piece down towards orbital decay, but larger items require more hands-on solutions. And seemingly all nautical in origin: RemoveDEBRIS has a net, a sail, and a harpoon. No cannon?

You can see how the three items are meant to operate here:

[embedded content]

The harpoon is meant for larger targets, for example full-size satellites that have malfunctioned and are drifting from their orbit. A simple mass driver could knock them towards the Earth, but capturing them and managing their descent is a far more controlled technique.

While an ordinary harpoon would simply be hurled by the likes of Queequeg or Daggoo, in space it’s a bit different. Sadly, it’s impractical to suit up a harpooner for EVA missions, so the whole thing has to be automated. Fortunately the organization is also testing computer vision systems that can identify and track targets. From there it’s just a matter of firing the harpoon at the target and reeling it in, which is what the satellite demonstrated today.

This Airbus-designed little item is much like a toggling harpoon, which has a piece that flips out once it pierces the target. Obviously it’s a single-use device, but it’s not particularly large and several could be deployed on different interception orbits at once. Once reeled in, a drag sail (seen in the video above) could be deployed to accelerate reentry. The whole thing could be done with little or no propellant, which greatly simplifies operation.

Obviously it’s not yet a threat to the starwhales. But we’ll get there. We’ll get those monsters good one day.

Asteroid is building a human-machine interaction engine for AR developers

When we interact with computers today, we move the mouse, we scroll the trackpad, we tap the screen. But there is so much that the machines don’t pick up on: where we’re looking, the subtle gestures we make and what we’re thinking.

Asteroid is looking to get developers comfortable with the idea that future interfaces are going to take in much more biosensory data. The team has built a node-based human-machine interface engine for macOS and iOS that allows developers to build interactions that can be imported into Swift applications.
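Asteroid hasn’t published its API, but the node-based idea itself is straightforward: signal sources (gaze, gesture, biosensor readings) flow through processing nodes into triggers. Here’s a minimal, purely hypothetical sketch of that pattern in Python — every name is invented for illustration and none of it comes from Asteroid’s actual engine:

```python
# Toy illustration of a node-based interaction graph, the general
# pattern behind engines like Asteroid's. All names are hypothetical.

class Node:
    """One processing step; nodes chain together into a pipeline."""
    def __init__(self):
        self.next = None

    def then(self, node):
        """Connect a downstream node and return it, so calls chain."""
        self.next = node
        return node

    def process(self, value):
        raise NotImplementedError

    def feed(self, value):
        """Run this node, then pass the result down the chain."""
        out = self.process(value)
        return self.next.feed(out) if self.next else out

class GazeInput(Node):
    """Entry node: passes raw (x, y) gaze coordinates through."""
    def process(self, value):
        return value

class Smooth(Node):
    """Averages the last n samples to damp eye-tracker jitter."""
    def __init__(self, n=3):
        super().__init__()
        self.n = n
        self.history = []

    def process(self, value):
        self.history = (self.history + [value])[-self.n:]
        xs, ys = zip(*self.history)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

class DwellTrigger(Node):
    """Fires (returns True) when smoothed gaze lands in a region."""
    def __init__(self, region):
        super().__init__()
        self.region = region  # (x0, y0, x1, y1)

    def process(self, value):
        x, y = value
        x0, y0, x1, y1 = self.region
        return x0 <= x <= x1 and y0 <= y <= y1

# Wire the graph: gaze -> smoothing -> hit test on a button region.
gaze = GazeInput()
gaze.then(Smooth(n=3)).then(DwellTrigger(region=(0, 0, 100, 50)))

hit = False
for sample in [(10, 10), (12, 11), (11, 9)]:
    hit = gaze.feed(sample)
print(hit)  # True: the smoothed gaze point falls inside the region
```

The appeal of the graph structure is that swapping an eye-tracker input for, say, a gesture or EEG node leaves the rest of the pipeline untouched, which is presumably why interactions built this way can be exported wholesale into an app.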

“What’s interesting about emerging human-machine interface tech is the hope that the user may be able to ‘upload’ as much as they can ‘download’ today,” Asteroid founder Saku Panditharatne wrote in a Medium post.

To bring attention to its development environment, the company has launched a crowdfunding campaign that gives a decent snapshot of the depth of experiences enabled by today’s commercially available biosensors. Asteroid definitely doesn’t want to be a hardware startup; the campaign largely serves to expose developers to what tools could be in their interaction design arsenal.

There are dev kits and then there are dev kits, and this is a dev kit. Developers jumping on board for the total package get a pile of open hardware: gear and cases to build hacked-together interface solutions. The $450 kit brings capabilities like eye tracking and brain-computer interface electrodes, plus some parts to piece together a motion controller. Backers can also just buy the $200 eye-tracking kit alone. It’s all very utility-minded and clearly not designed to make Asteroid those big hardware bucks.

“The long-term goal is to support as much AR hardware as we can, we just made our own kit because I don’t think there is that much good stuff out there outside of labs,” Panditharatne told TechCrunch.

The crazy hardware seems to be a bit of a labor of love for the time being. While a couple of AR/VR devices have eye tracking baked in, it’s still a generation away from most consumer VR headsets, and you’re certainly not going to find much hardware with brain-computer interface systems built in. The startup says its engine will do plenty with just a smartphone camera and a microphone, but the broader sell with the dev kit is that you’re not building for a specific piece of hardware; you’re experimenting on the bet that interfaces are going to grow more closely intertwined with how we process the world as humans.

Panditharatne founded the company after stints at Oculus and Andreessen Horowitz where she spent a lot of time focusing on the future of AR and VR. Panditharatne tells us that Asteroid has raised over $2 million in funding already but that they’re not detailing the sources of that cash quite yet.

The company is looking to raise $20,000 from its Indiegogo campaign, but the platform is the clear sell here: exposing people to its human-machine interaction engine. Asteroid is taking sign-ups to join the waiting list for the product on its site.