
Google’s latest hardware innovation: Price

With its latest consumer hardware products, Google is undercutting Apple, Samsung, and Amazon on price. The search giant just unveiled its latest flagship smartphone, tablet, and smart home device, all available at prices well below those of their direct competitors. Where Apple and Samsung are pushing the prices of their latest products even higher, Google is seemingly happy to keep prices low, and this is creating a distinct advantage for the company’s products.

Google, like Amazon and increasingly Apple, is a services company that happens to sell hardware. It needs to acquire users through multiple verticals, including hardware. Somewhere, deep in the Googleplex, a team of number crunchers decided it made more sense to price its hardware dramatically lower than competitors do. If Google is taking a loss on the hardware, it is likely making it back through services.

Amazon does this with Kindle devices. Microsoft and Sony do it with game consoles. It is a proven strategy for increasing market share, where back-end revenue offsets the slim or negative margins on the hardware itself.

Look at the Pixel 3. The base 64GB model is available for $799 while the base 64GB iPhone XS is $999. Want a bigger screen? The 64GB Pixel 3 XL is $899, and the 64GB iPhone XS Max is $1099. Regarding the specs, both phones offer OLED displays and amazing cameras. There are likely pros and cons regarding the speed of the SoC, amount of RAM and wireless capabilities. Will consumers care since the screen and camera are so similar? Probably not.

Google also announced the Home Hub today. Like the Echo Show, it’s designed to be the central part of a smart home. It puts Google Assistant on a fixed screen where users can ask it questions and control a smart home. It’s $149. That’s $80 less than the Echo Show, though the Google version lacks video conferencing and a dedicated smart home hub — the Google Home Hub requires extra hardware for some smart home objects. Still, even with fewer features, the Home Hub is compelling because of its drastically lower price. For just a few dollars more than an Echo Show, a buyer could get a Home Hub and two Home Minis.

The Pixel Slate is Google’s answer to the iPad Pro. From everything we’ve seen, it appears to lack a lot of the processing power found in Apple’s top tablet. It doesn’t seem as refined or as capable at specific tasks. But for viewing media, creating content and playing games, it feels just fine. It even has a Pixelbook Pen and a great keyboard, which shows Google is positioning it against the iPad Pro. And the 12.3-inch Pixel Slate is available for $599 while the 12.9-inch iPad Pro is $799.

The upfront price is just part of the equation. When considering the resale value of these devices, a different conclusion can be reached. Apple products consistently resell for more money than Google products. On Gazelle.com, a company that buys used smartphones, a used iPhone X is worth $425 while a used Pixel 2 is $195. A used iPhone 8, a phone that sold at a price closer to the Pixel 2’s, is worth $240.

In the end, Google likely doesn’t expect to make money off the hardware it sells. It needs users to buy into its services. The best way to do that is to make the ecosystem competitive without necessarily investing the capital to make it the best. It needs to be just good enough, and that’s how I would describe these devices: good enough to be competitive on a spec-to-spec basis while available for much less.


The Das Keyboard 5Q adds IoT to your I/O keys

Just when you thought you were safe from IoT on your keyboard, Das Keyboard has come out with the 5Q, a smart keyboard that can send you notifications and change colors based on the app you’re using.

These kinds of keyboards aren’t particularly new – you can find gaming keyboards that light up in all the colors of the rainbow. But the 5Q is almost completely programmable, and you can connect it to the automation services IFTTT or Zapier. This means you can do things like blink the space bar red when someone passes your Nest camera or blink the Tab key white when the outdoor temperature falls below 40 degrees.
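On the IFTTT side, automations like these are typically wired up through the Webhooks (Maker) service, where an applet reacts to a custom trigger event. Here's a minimal sketch of firing such an event from your own code; the event name `front_door_motion` and the key are placeholders, and the routing from the event to a 5Q key effect would be configured in the applet itself, not shown here.

```python
# Hypothetical sketch: firing a custom IFTTT Webhooks trigger that an
# applet could route to a 5Q key effect. Event name and key are placeholders.
import json
import urllib.request


def maker_webhook_url(event: str, key: str) -> str:
    """Build the standard IFTTT Maker Webhooks trigger URL."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"


def fire_event(event: str, key: str, value1: str = "") -> urllib.request.Request:
    """Prepare a POST request carrying an optional payload value."""
    data = json.dumps({"value1": value1}).encode("utf-8")
    return urllib.request.Request(
        maker_webhook_url(event, key),
        data=data,
        headers={"Content-Type": "application/json"},
    )
```

Sending the prepared request (e.g. with `urllib.request.urlopen`) would let any script on your machine light up a key via an applet.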

You can also make a key blink when someone tweets, which could be helpful or frustrating.

The $249 keyboard is delightfully rugged and the switches – called Gamma Zulu and made by Das Keyboard – are nicely clicky but not too loud. The keys have a bit of softness to them at the halfway point, so if you’re used to Cherry-style keyboards you might notice a difference here. That said, the keys are rated for 100 million actuations, far more than any competing switch. The RGB LEDs in each key, as you can see below, are very bright and visible, but when the key lights are all off the keyboard is completely unreadable. This, depending on your desire to be Case from Neuromancer, is a feature or a bug. There is also a media control knob in the top right corner that brings up the Q app when pressed.

The entire package is nicely designed, but the 5Q begs the question: do you really need a keyboard that can notify you when you get a new email? The Mac version of the software is also a bit buggy right now, but Das Keyboard is updating it constantly, and I was able to install and run it without issue. Weird things sometimes happen, however. For example, my Escape and F1 keys are currently blinking red and I don’t know how to turn them off.

That said, Das Keyboard makes great keyboards. They’re my absolute favorite in terms of form factor and key quality and if you need a keyboard that can notify you when a cryptocurrency goes above a certain point or your Tesla stock is about to tank, look no further than the 5Q. It’s a keyboard for hackers by hackers and, as you can see below, the color transitions are truly mesmerizing.

Review: iPhone XS, XS Max and the power of long-term thinking

The iPhone XS proves one thing definitively: that the iPhone X was probably one of the most ambitious product bets of all time.

When Apple told me in 2017 that they put aside plans for the iterative upgrade that they were going to ship and went all in on the iPhone X because they thought they could jump ahead a year, they were not blustering. That the iPhone XS feels, at least on the surface, like one of Apple’s most “S” models ever is a testament to how aggressive the iPhone X timeline was.

I think there will be plenty of people who will see this as a weakness of the iPhone XS, and I can understand their point of view. There are about a half-dozen definitive improvements in the XS over the iPhone X, but none of them has quite the buzzword-worthy effectiveness of a marquee upgrade like 64-bit, 3D Touch or wireless charging — all benefits delivered in previous “S” years.

That weakness, however, is only really present if you view it through the eyes of the year-over-year upgrader. As an upgrade over an iPhone X, I’d say you’re going to have to love what they’ve done with the camera to want to make the jump. As a move from any other device, it’s a huge win and you’re going head-first into sculpted OLED screens, face recognition and super durable gesture-first interfaces and a bunch of other genre-defining moves that Apple made in 2017, thinking about 2030, while you were sitting back there in 2016.

Since I do not have an iPhone XR, I can’t really make a call for you on that comparison, but from what I saw at the event and from what I know about the tech in the iPhone XS and XS Max from using them over the past week, I have some basic theories about how it will stack up.

For those with interest in the edge of the envelope, however, there is a lot to absorb in these two new phones, separated only by size. Once you begin to unpack the technological advancements behind each of the upgrades in the XS, you begin to understand the real competitive edge and competence of Apple’s silicon team, and how well they listen to what the software side needs now and in the future.

Whether that makes any difference for you day to day is another question, one that, as I mentioned above, really lands on how much you like the camera.

But first, let’s walk through some other interesting new stuff.

Notes on durability

As is always true with my testing methodology, I treat this as anyone would who got a new iPhone and loaded an iCloud backup onto it. Plenty of other sites will do clean room testing if you like comparison porn, but I really don’t think that does most folks much good. By and large most people aren’t making choices between ecosystems based on one spec or another. Instead, I try to take them along on prototypical daily carries, whether to work for TechCrunch, on vacation or doing family stuff. A foot injury precluded any theme parks this year (plus, I don’t like to be predictable) so I did some office work, road travel in the center of California and some family outings to the park and zoo. A mix of use cases that involves CarPlay, navigation, photos and general use in a suburban environment.

In terms of testing locale, Fresno may not be the most metropolitan city, but it’s got some interesting conditions that set it apart from the cities where most of the iPhones are going to end up being tested. Network conditions are pretty adverse in a lot of places, for one. There’s a lot of farmland and undeveloped acreage and not all of it is covered well by wireless carriers. Then there’s the heat. Most of the year it’s above 90 degrees Fahrenheit and a good chunk of that is spent above 100. That means that batteries take an absolute beating here and often perform worse than other, more temperate, places like San Francisco. I think that’s true of a lot of places where iPhones get used, but not so much the places where they get reviewed.

That said, battery life has been hard to judge. In my rundown tests, the iPhone XS Max clearly went beast mode, outlasting my iPhone X and the iPhone XS. Between those two, though, it was tougher to tell. I try to wait until the end of the period I have to test the phones to do battery stuff so that background indexing doesn’t affect the numbers. In my ‘real world’ testing in the 90-plus-degree heat around here, the iPhone XS did best my iPhone X by a few percentage points, which matches Apple’s claim, but my X is also a year old. The battery didn’t fail during even intense days of testing with the XS.

In terms of storage, I’m tapping at the door of 256GB, so the addition of a 512GB option is really nice. As always, the easiest way to determine what size you should buy is to check your existing free space. If you’re using around 50% of what your phone currently has, buy the same size. If you’re using more, consider upgrading, because these phones are only getting faster at taking better pictures and video, and that will eat up more space.
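That rule of thumb is simple enough to write down. A toy encoding, using the iPhone XS storage tiers; the 50% cutoff is the heuristic above, not anything Apple publishes:

```python
# Toy encoding of the storage rule of thumb: if you're using about half
# (or less) of your current capacity, the same size is fine; otherwise
# step up a tier. The 50% cutoff is the article's heuristic, not Apple's.

TIERS_GB = [64, 256, 512]  # iPhone XS storage options


def recommend_storage(current_gb: int, used_gb: int) -> int:
    if used_gb <= current_gb * 0.5:
        # Comfortable headroom: same size (smallest tier that fits) is fine.
        return min(t for t in TIERS_GB if t >= current_gb)
    # Running tight: recommend the next tier up, capped at the largest.
    bigger = [t for t in TIERS_GB if t > current_gb]
    return bigger[0] if bigger else TIERS_GB[-1]
```

So a 64GB phone with 20GB used maps to 64GB again, while the same phone with 50GB used maps to 256GB.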

The review units I was given both had the new gold finish. As I mentioned on the day, this is a much deeper, brassier gold than the Apple Watch Edition. It’s less ‘pawn shop gold’ and more ‘this is very expensive’ gold. I like it a lot, though it is hard to photograph accurately — if you’re skeptical, try to see it in person. It has a touch of pink added in, especially as you look at the back glass along with the metal bands around the edges. The back glass has a pearlescent look now as well, and we were told that this is a new formulation that Apple created specifically with Corning. Apple says that this is the most durable glass ever in a smartphone.

My current iPhone has held up to multiple falls from over 3 feet over the past year, one of which resulted in a broken screen and a replacement under warranty. Doubtless multiple YouTubers will be hitting this thing with hammers and dropping it from buildings in beautiful Phantom Flex slo-mo soon enough. I didn’t test it. One thing I am interested in seeing develop, however, is how the glass holds up to fine abrasions and scratches over time.

My iPhone X is riddled with scratches both front and back, something having to do with the glass formulation being harder, but more brittle. Less likely to break on impact but more prone to abrasion. I’m a dedicated no-caser, which is why my phone looks like it does, but there’s no way for me to tell how the iPhone XS and XS Max will hold up without giving them more time on the clock. So I’ll return to this in a few weeks.

Both the gold and space grey iPhone XS models have been subjected to a coating process called physical vapor deposition, or PVD. Basically, metal particles get vaporized and bonded to the surface to coat and color the band. PVD is a process, not a material, so I’m not sure what they’re actually coating these with, but one suggestion has been titanium nitride. I don’t mind the weathering that has happened on my iPhone X band, but I think it would look a lot worse on the gold, so I’m hoping that this process (which is known to be incredibly durable and is used in machine tooling) will improve the durability of the band. That said, I know most people are not no-casers like me, so it’s likely a moot point.

Now let’s get to the nut of it: the camera.

Bokeh let’s do it

I’m (still) not going to be comparing the iPhone XS to an interchangeable-lens camera, because portrait mode is not a replacement for those; it’s about pulling them out less. That said, this is the closest it’s ever been.

One of the major hurdles that smartphone cameras have had to overcome in their comparisons to cameras with beautiful glass attached is their inherently deep depth of field. Without getting too into the weeds (feel free to read this for more): because they’re so small, smartphone cameras produce an image with such deep focus that everything is sharp. This doesn’t feel like a portrait or well-composed shot from a larger camera, because it doesn’t produce background blur. That blur was added a couple of years ago with Apple’s portrait mode and has been duplicated since by every manufacturer that matters — to varying levels of success or failure.

By and large, most manufacturers do it in software. They figure out what the subject probably is, use image recognition to find the eyes/nose/mouth triangle, build a quick matte and blur everything else. Apple does more, adding the parallax of two lenses OR the IR projector of the TrueDepth array that enables Face ID to gather a multi-layer depth map.

As a note, the iPhone XR works differently, and with fewer tools, to enable portrait mode. Because it only has one lens it uses focus pixels and segmentation masking to ‘fake’ the parallax of two lenses.
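The basic recipe — keep the subject sharp, blur whatever the depth map marks as background — can be sketched in a few lines. This is a deliberately tiny 1-D grayscale toy, not Apple's pipeline: real implementations work on 2-D images, multi-layer depth maps and far more sophisticated blur kernels.

```python
# Minimal sketch of depth-driven background blur (not Apple's actual
# pipeline): pixels whose depth-map value marks them as background get
# averaged with their neighbors; foreground pixels pass through untouched.

def portrait_blur(pixels, depth, threshold):
    """1-D grayscale toy: blur pixels whose depth exceeds `threshold`."""
    out = []
    n = len(pixels)
    for i, (p, d) in enumerate(zip(pixels, depth)):
        if d <= threshold:  # foreground: keep sharp
            out.append(p)
        else:               # background: 3-tap box blur
            window = pixels[max(0, i - 1):min(n, i + 2)]
            out.append(sum(window) / len(window))
    return out
```

The interesting part of Apple's version is everything this sketch omits: how the depth map is built, and how the blur kernel itself is shaped, which is where bokeh comes in below.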

With the iPhone XS, Apple is continuing to push ahead with the complexity of its modeling for the portrait mode. The relatively straightforward disc blur of the past is being replaced by a true bokeh effect.

Background blur in an image is related directly to lens compression, subject-to-camera distance and aperture. Bokeh is the character of that blur. It’s more than just ‘how blurry’, it’s the shapes produced from light sources, the way they change throughout the frame from center to edges, how they diffuse color and how they interact with the sharp portions of the image.

Bokeh is to blur what seasoning is to a good meal. Unless you’re the chef, you probably don’t care what they did; you just care that it tastes great.

Well, Apple chef-ed it the hell up with this. Unwilling to settle for a templatized bokeh that felt good and leave it at that, the camera team went the extra mile and created an algorithmic model that contains virtual ‘characteristics’ of the iPhone XS’s lens. Just as a photographer might pick one lens or another for a particular effect, the camera team built out the bokeh model after testing a multitude of lenses from all of the classic camera systems.

I keep saying model because it’s important to emphasize that this is a living construct. The blur you get will look different from image to image, at different distances and in different lighting conditions, but it will stay true to the nature of the virtual lens. Apple’s bokeh has a medium-sized penumbra, spreading out light sources but not blowing them out. It maintains color nicely, making sure that the quality of light isn’t obscured like it is with so many other portrait applications in other phones that just pick a spot and create a circle of standard gaussian or disc blur.

Check out these two images, for instance. Note that when the light is circular, it retains its shape, as does the rectangular light. It is softened and blurred, as it would be when diffused through the widened aperture of a regular lens. The same goes for other shapes in reflected-light scenarios.

Now here’s the same shot from an iPhone X; note the indiscriminate blur of the light. This modeling effort is why I’m glad that the adjustment slider proudly carries f-stop or aperture measurements: this is what the image would look like at a given aperture, rather than on a 0-100 scale. It’s very well done and, because it’s modeled, it can be improved over time. My hope is that eventually developers will be able to plug in their own numbers to “add lenses” to a user’s kit.

And an adjustable depth of focus isn’t just good for blurring, it’s also good for un-blurring. This portrait mode selfie placed my son in the blurry zone because it focused on my face. Sure, I could turn portrait mode off on an iPhone X and get everything sharp, but now I can choose to “add” him to the in-focus area while still leaving the background blurry. It’s a super cool feature that I think is going to get a lot of use.

It’s also great for removing unwanted people or things from the background by cranking up the blur.

And yes, it works on non-humans.

If you end up with an iPhone XS, I’d play with the feature a bunch to get used to what a super wide aperture lens feels like. When it’s open all the way to f/1.4 (not the actual widest aperture of the lens, by the way; this is the virtual model we’re controlling), pretty much only the eyes should be in focus. Ears, shoulders, maybe even the nose could be out of the focus area. It takes some getting used to but can produce dramatic results.

A 150% crop of a larger photo to show detail preservation.

Developers do have access to one new feature though, the segmentation mask. This is a more precise mask that aids in edge detailing, improving hair and fine line detail around the edges of a portrait subject. In my testing it has led to better handling of these transition areas and less clumsiness. It’s still not perfect, but it’s better. And third-party apps like Halide are already utilizing it. Halide’s co-creator, Ben Sandofsky, says they’re already seeing improvements in Halide with the segmentation map.

“Segmentation is the ability to classify sets of pixels into different categories,” says Sandofsky. “This is different than a “Hot dog, not a hot dog” problem, which just tells you whether a hot dog exists anywhere in the image. With segmentation, the goal is drawing an outline over just the hot dog. It’s an important topic with self driving cars, because it isn’t enough to tell you there’s a person somewhere in the image. It needs to know that person is directly in front of you. On devices that support it, we use PEM as the authority for what should stay in focus. We still use the classic method on old devices (anything earlier than iPhone 8), but the quality difference is huge.”

The above is an example shot in Halide that shows the image, the depth map and the segmentation map.

In the example below, the middle black-and-white image is what was possible before iOS 12. Using a handful of rules like “Where did the user tap in the image?”, we constructed this matte to apply our blur effect. It’s not bad by any means, but compare it to the image on the right. For starters, it’s much higher resolution, which means the edges look natural.

My testing of portrait mode on the iPhone XS says that it is massively improved, but that there are still some very evident quirks that lead to weirdness in some shots, like the wrong things being blurred and halos of light appearing around subjects. It’s also not quite aggressive enough on foreground objects — those should blur too, but only sometimes do. But the quirks are overshadowed by the super cool addition of the adjustable background blur. If conditions are right, it blows you away. But every once in a while you still get the sense that the Neural Engine just threw up its hands and shrugged.

Live preview of the depth control in the camera view is not in iOS 12 at the launch of the iPhone XS, but it will be coming in a future version of iOS 12 this fall.

I also shoot a huge number of photos with the telephoto lens. It’s closer to what you’d consider to be a standard lens on a camera. The normal lens is really wide, and once you acclimate to the telephoto you’re left wondering why you have a bunch of pictures of people in the middle of a ton of foreground and sky. If you haven’t already, I’d say try defaulting to 2x for a couple of weeks and see how you like your photos. For those tight conditions or really broad landscapes you can always drop back to the wide. Because of this, any iPhone that doesn’t have a telephoto is a non-starter for me, which I believe is going to be one of the limiters on people moving from the iPhone X to the iPhone XR. Even iPhone 8 Plus users who rely on the telephoto will, I believe, miss it if they don’t go to the XS.

But, man, Smart HDR is where it’s at

I’m going to say something now that is surely going to cause some Apple followers to snort, but it’s true. Here it is:

For a company as prone to hyperbole and Maximum Force Enthusiasm about its products, I think Apple has dramatically undersold how much photos have improved from the iPhone X to the iPhone XS. It’s extreme, and it has to do with a technique Apple calls Smart HDR.

Smart HDR on the iPhone XS encompasses a bundle of techniques and technologies, including highlight recovery, rapid-firing the sensor, an OLED screen with much-improved dynamic range and the Neural Engine/image signal processor combo. The camera now runs faster sensors and offloads some of the work to the CPU, which lets it fire off nearly two images for every one it used to, ensuring that motion does not create ghosting in HDR images. It picks the sharpest image, merges the other frames into it in a smarter way and applies tone mapping that produces more even exposure and color in the roughest of lighting conditions.
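The pick-the-sharpest-then-merge idea can be sketched very roughly. This is an illustrative toy, not Apple's implementation: frames here are flat lists of brightness values, sharpness is a crude neighbor-difference proxy, and the blend weight is arbitrary.

```python
# Rough sketch of the bracket-merge idea (not Apple's implementation):
# from a burst of frames, pick the one with the most local contrast as
# the reference, then blend the rest in at a lower weight to recover
# highlight/shadow detail without ghosting the subject.

def sharpness(frame):
    """Sum of absolute neighbor differences as a cheap contrast proxy."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))


def merge_burst(frames, blend=0.25):
    ref = max(frames, key=sharpness)          # sharpest frame anchors the merge
    others = [f for f in frames if f is not ref]
    if not others:
        return list(ref)
    merged = []
    for i, r in enumerate(ref):
        avg_other = sum(f[i] for f in others) / len(others)
        merged.append((1 - blend) * r + blend * avg_other)
    return merged
```

The real pipeline does this per-region with motion-aware alignment, which is exactly what the extra sensor readouts and Neural Engine horsepower buy.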

iPhone XS shot: better range of tones, skin tone and black point

iPhone X shot: not a bad image at all, but blocking up of shadow detail, flatter skin tone and a blue shift

Nearly every image you shoot on an iPhone XS or iPhone XS Max will have HDR applied to it. It does it so much that Apple has stopped labeling most images with HDR at all. There’s still a toggle to turn Smart HDR off if you wish, but by default it will trigger any time it feels it’s needed.

And that includes types of shots that couldn’t benefit from HDR before: panoramic shots, burst shots, low-light photos and every frame of Live Photos are now processed.

The results for me have been massively improved quick snaps with no thought given to exposure or adjustments due to poor lighting. Your camera roll as a whole will just suddenly start looking like you’re a better picture taker, with no intervention from you. All of this is capped off by the fact that the OLED screens in the iPhone XS and XS Max have a significantly improved ability to display a range of color and brightness. So images will just plain look better on the wider gamut screen, which can display more of the P3 color space.

Under the hood

As far as Face ID goes, there has been no perceivable difference for me in speed or recognition rate, but my facial model had been training on my iPhone X for a year; it’s starting fresh on the iPhone XS. And I’ve always been lucky that Face ID has just worked for me most of the time. The gist of the improvements here is jumps in acquisition time and in confirming the map against the pattern match. There are also supposed to be improvements in off-angle recognition of your face, say when lying down or when your phone is flat on a desk. I tried a lot of different positions here and could never really definitively say that the iPhone XS was better in this regard, though, as I said above, it very likely takes training time to get it near the confidence levels that my iPhone X has stored away.

In terms of CPU performance the world’s first at-scale 7nm architecture has paid dividends. You can see from the iPhone XS benchmarks that it compares favorably to fast laptops and easily exceeds iPhone X performance.


The Neural Engine and the improved A12 chip have meant better frame rates in intense games and AR, faster image searches and some small improvements in app launches. One easy way to demonstrate this is the video from the iScape app, captured on an iPhone X and an iPhone XS. You can see how jerky and FPS-challenged the iPhone X is in a similar AR scenario. There is so much more overhead for AR experiences that I know developers are going to be salivating over what they can do here.

The stereo sound is impressive: surprisingly decent separation for a phone and definitely louder. The tradeoff is that you get asymmetrical speaker grilles, so if that kind of thing annoys you, you’re welcome.

Upgrade or no

Every other year for the iPhone I see and hear the same things — that the middle years are unimpressive and not worthy of upgrading. And I get it, money matters, phones are our primary computer and we want the best bang for our buck. This year, as I mentioned at the outset, the iPhone X has created its own little pocket of uncertainty by still feeling a bit ahead of its time.

I don’t kid myself into thinking that we’re going to have an honest discussion about whether you want to upgrade from the iPhone X to iPhone XS or not. You’re either going to do it because you want to or you’re not going to do it because you don’t feel it’s a big enough improvement.

And I think Apple is completely fine with that because iPhone XS really isn’t targeted at iPhone X users at all, it’s targeted at the millions of people who are not on a gesture-first device that has Face ID. I’ve never been one to recommend someone upgrade every year anyway. Every two years is more than fine for most folks — unless you want the best camera, then do it.

And, given Apple’s fairly bold talk about making sure that iPhones last as long as they can, I think it is well into the era where it is planning on having a massive installed user base that rents iPhones from it on a monthly, yearly or biennial basis. And it doesn’t care whether those phones are on their first, second or third owner, because that user base will need for-pay services that Apple can provide. It seems to be moving in that direction already, with phones as old as the five-year-old iPhone 5s still getting iOS updates.

With the iPhone XS, we might just be seeing the true beginning of the iPhone-as-a-service era.

The iPhone XR shows Apple admitting 3D Touch is a failure

Remember 3D Touch? Unless you’re a power iOS user you probably don’t. Or, well, you’d rather not. It’s been clear for some time now that the technology Apple lauded at its 2015 unveiling as the “next generation of multi-touch” most certainly wasn’t. For the mainstream iPhone user it’s just that annoying thing that gets in the way of what you’re actually trying to do.

What Apple actually made with 3D Touch is the keyboard shortcut of multi-touch. Aka a secret weapon for nerds only.

Pro geeks might be endlessly delighted about being able to learn the secrets of its hidden depths, and shave all-important microseconds off of their highly nuanced workflows. But everyone else ignores it.

Or at least tries to ignore it — until, in the middle of trying to do something important they accidentally trigger it and get confused and annoyed about what their phone is trying to do to them.

Tech veterans might recall that BlackBerry (remember them?!) tried something similarly misplaced a decade ago on one of its handsets — shipping an unlovely (and unloved) clickable touchscreen in the one-off weirdo BlackBerry Storm.

The Storm didn’t have the iconic physical BlackBerry keyboard but did have a touchscreen with on-screen qwerty keys you could still click. In short, madness!

Safe to say, no usage storms resulted then either — unless you’re talking about the storm of BlackBerry buyers returning to the shop demanding a replacement handset.

In Apple’s case, the misstep is hardly on that level. But three years on from unveiling 3D Touch, it’s now ‘fessing up to its own feature failure — as the latest iPhone line-up drops the pressure-sensing technology entirely from the cheapest of the trio: The iPhone XR.

The lack of 3D Touch on the XR will help shave off some manufacturing cost and maybe a little thickness from the device. Mostly, though, it shows Apple recognizing that it expended a lot of engineering effort on something most iPhone users don’t use and don’t want to use — given that, as TC’s Brian Heater has called it, the iPhone XR is the iPhone for the rest of us.

It isn’t a budget handset, though. The XR does pack Apple’s next-gen biometric technology, Face ID, for instance, so contains a package of sophisticated sensor hardware lodged in its own top notch.

That shows Apple is not cheaping out here. Rather it’s making selective feature decisions based on what it believes iPhone users want and need. So the clear calculation in Cupertino is lots of iPhone users simply don’t need 3D Touch.

At the same time, company execs heaped praise on Face ID at its event this week, saying the technology has proved wildly popular with users. Yet they glossed over the simultaneous deprecation of 3D Touch at the end of the iPhone line without a word of explanation.

Compare the two technologies and it’s easy to see why.

Face ID’s popularity is hardly surprising. It’s hard to think of a simpler interaction than a look that unlocks.

Not so the fiddly 3D Touch, which requires a press that’s more than a tap — kind of akin to a push or a little shove. Push too softly and you’ll get a tap, which takes you somewhere you weren’t trying to go. But go in too hard from the start and the touchscreen starts to feel like work and/or wasted effort.

On top of that, the sought-after utility can itself feel pointless — with, for example, content previews that can be horribly slow to load. Why not just tap and look at the email in the first place?

With all the fingering and faffing around, 3D Touch is like the Goldilocks of user interfaces: frustration is all but guaranteed unless you have an awful lot of patience to keep going until you get it just right. And who, but power users, can be bothered with that?

For the ‘everyman’ iPhone XR, Apple has swapped 3D Touch for a haptic feedback feature (forgettably named Haptic Touch) — that’s presumably mostly intended to be a sticking plaster to smooth out any fragmentation cracks across the iPhone estate, i.e. in the rare instances where developers have made use of 3D Touch to create in-app shortcuts that people do actually want to use.

If, as we’ve suggested, the iPhone XR ends up being the iPhone that ships in serious quantities there will soon be millions of iOS users without access to 3D Touch at all. So Apple is relegating the technology it once called the future of multi-touch to what it really was: An add-on power feature for pro users.

Pro users are also the people most likely to be willing to spend the biggest bucks on an iPhone — and so will happily shell out to own the iPhone XS or XS Max (which do retain 3D Touch, at least for now).

So while 3D Touch might keep incrementally helping to shift a few extra premium iPhones at the top of the range, it isn’t going to be shifting any paradigms.

Multitouch — combined with generous screen real estate — has been more than good enough on that front.

Apple defends decision not to remove InfoWars’ app

Apple has commented on its decision to continue to allow conspiracy theorist profiteer InfoWars to livestream video podcasts via an app in its App Store, despite removing links to all but one of Alex Jones' podcasts from its iTunes and podcast apps earlier this week.

At the time Apple said the podcasts had violated its community standards, emphasizing that it “does not tolerate hate speech”, and saying: “We believe in representing a wide range of views, so long as people are respectful to those with differing opinions.”

Yet the InfoWars app allows iOS users to livestream the same content Apple just pulled from iTunes.

In a statement given to BuzzFeed News, Apple explains its decision not to pull the InfoWars app, saying:

We strongly support all points of view being represented on the App Store, as long as the apps are respectful to users with differing opinions, and follow our clear guidelines, ensuring the App Store is a safe marketplace for all. We continue to monitor apps for violations of our guidelines and if we find content that violates our guidelines and is harmful to users we will remove those apps from the store as we have done previously.

Multiple tech platforms have moved to close the door on Jones or limit his reach in recent weeks, including Google, which shuttered his YouTube channel, and Facebook, which removed a series of videos and banned Jones' personal account for 30 days, as well as issuing the InfoWars page with a warning strike. Spotify, Pinterest, LinkedIn, MailChimp and others have also taken action.

Twitter, however, has not banned or otherwise censured Jones, despite InfoWars' continued presence on its platform undermining CEO Jack Dorsey's claimed push to improve conversational health there. Snapchat is also merely monitoring Jones' continued presence on its platform.

In an unsurprising twist, the additional exposure Jones/InfoWars has gained as a result of news coverage of the various platform bans appears to have given his apps some passing uplift…

So Apple’s decision to remove links to Jones’ podcasts yet allow the InfoWars app looks contradictory.

The company is certainly treading a fine line here. But there’s a technical distinction between a link to a podcast in a directory, where podcast makers can freely list their stuff (with the content hosted elsewhere), vs an app in Apple’s App Store which has gone through Apple’s review process and the content is being hosted by Apple.

When it removed Jones’ podcasts Apple was, in effect, just removing a pointer to the content, not the content itself. The podcasts also represented discrete content — meaning each episode which was being pointed to could be judged against Apple’s community standards. (And one podcast link was not removed, for example, though five were.)

Whereas Jones (mostly) uses the InfoWars app to livestream podcast shows. Meaning the content in the InfoWars app is more ephemeral — making it more difficult for Apple to cross-check against its community standards. The streamer has to be caught in the act, as it were.

Google has also not pulled the InfoWars app from its Play Store despite shuttering Jones’ YouTube channel, and a spokesperson told BuzzFeed: “We carefully review content on our platforms and products for violations of our terms and conditions, or our content policies. If an app or user violates these, we take action.”

That said, both the iOS and Android versions of the app also include ‘articles’ that can be saved by users, so some of the content appears to be less ephemeral.

The iOS listing further claims the app lets users "stay up to date with articles as they're published from Infowars.com", which at least suggests some of the content is identical to what's being spouted on Jones' own website (where he's only subject to his own T&Cs).

But in order to avoid falling foul of Apple's and Google's app store guidelines, Jones is likely carefully choosing which articles are funneled into the apps: to avoid breaching app store T&Cs against abuse and hateful conduct, and (most likely also) to hook more eyeballs with softer conspiracy nonsense before, once they're pulled into his orbit, blasting people with his full-bore BS shotgun on his own platform.

Sample articles depicted in screenshots in the App Store listing for the app include one claiming that George Soros is "literally behind Starbucks' sensitivity training" and another, from the 'science' section, pushing junk claims about vision correction. All garbage, then, but not at the same level of anti-truth toxicity that Jones has become notorious for on his shows. The Play Store listing flags a different selection of sample articles with a slightly more international flavor, including several on European far-right politics, in addition to U.S.-focused political stories about Trump and some outrage about domestic 'political correctness gone mad'. So the static sample content, at least, isn't enough to violate any T&Cs.

Still, the livestream component of the apps presents an ongoing problem for Apple and Google — given both have stated that his content elsewhere violates their standards. And it’s not clear how sustainable it will be for them to continue to allow Jones a platform to livestream hate from inside the walls of their commercial app stores.

Beyond that, narrowly judging Jones — a purveyor of weaponized anti-truth (most egregiously his claim that the Sandy Hook Elementary School shooting was a hoax) — by the content he uploads directly to their servers also ignores the wider context (and toxic baggage) around him.

And while no tech companies want their brands to be perceived as toxic to conservative points of view, InfoWars does not represent conservative politics. Jones peddles far right conspiracy theories, whips up hate and spreads junk science in order to generate fear and make money selling supplements. It’s cynical manipulation not conservatism.

Both should revisit their decision. Hateful anti-truth merely damages the marketplace of ideas they claim to want to champion, and chills free speech through the violent bullying of minorities and the other people it makes into targets and thus victimizes.

Earlier this week 9to5Mac reported that CNN's Dylan Byers has said the decision to remove links to InfoWars' podcasts had been made at the top of Apple, after a meeting between CEO Tim Cook and SVP Eddy Cue. Byers reported it was also the execs' decision not to remove the InfoWars app.

We’ve reached out to Apple to ask whether it will be monitoring InfoWars’ livestreams directly for any violations of its community standards and will update this story with any response.