Cloud Health Services, Part 1: Benefits and Complications

The cloud offers a host of potential uses, according to the healthcare industry and academic medical center representatives who participated in a Healthcare Information and Management Systems Society
survey last year.

Application hosting was the top use, identified by 90 percent of the 64 respondents.

Other potential uses cited:

  • Disaster recovery and backup – 84 percent
  • Hosting primary data storage such as application data – 74 percent
  • Hosted email services – nearly 70 percent
  • Managed services – 52 percent
  • Virtual servers – 36.5 percent
  • Security – 36.5 percent

The global healthcare cloud computing market is projected to reach US$35 billion by 2022, up from about $20 billion in 2017, according to BCC Research.

Cloud computing lets user organizations avoid heavy capital expenditures, and it requires less skilled IT staff, according to a report from the Cloud Standards Customer Council, an end-user advocacy group dedicated to accelerating cloud adoption.

The cloud also offers the following potential advantages:

  • Easier scalability and ability to adjust rapidly to demand;
  • Better security and privacy for health data and health systems than in-house systems;
  • Improved information sharing through standard protocols — although vendor contracts and technical impediments remain a problem;
  • Support for rapid development and innovation, especially for mobile technologies and the IoT; and
  • Better analytics capabilities, through services such as intelligent business process management suites and case management frameworks to mitigate medical mistakes.

“Cloud services in healthcare is a rapidly evolving area, and a case where cloud makes a lot of sense from a cost management and portability perspective,” Rebecca Wettemann, VP of research at Nucleus Research, told the E-Commerce Times.

However, all is not rosy in cloud healthcare land. Standardization has been a problem, so information sharing is not quite as easy as it should be. Security also has been problematic.

Everyone Wants to Get Into the Act

Major cloud players Google, Microsoft and Amazon have been pushing into the healthcare field, along with several smaller firms. Facebook and Apple also have been showing interest.

“We’re seeing a great deal of interest in the cloud in the healthcare industry,” remarked Joe Corkery, Google Cloud’s head of product healthcare and life sciences.

Most have been offering traditional cloud-based services such as hosting and Software as a Service, as well as security, data storage and data crunching. Google and Apple have been leveraging the Internet of Things through mobile devices.

All providers have an eye on the IoT, which is expected to generate a huge market, sparked by smart cars, homes, cities and the ubiquity of mobile devices.

“Offerings that are more enterprise-focused — like Microsoft Azure, IBM SoftLayer and Dell Virtustream — should have a significant advantage because of their higher focus on security,” observed Rob Enderle, principal analyst at the Enderle Group.

On the other hand, smaller companies “should have a significant advantage, thanks to the strong privacy laws, if they have that extra focus on security,” he told the E-Commerce Times, and don’t have “a bad reputation for the cloud, like Apple, or a bad reputation for privacy, like Google.”

The Healthcare Tower of Babel

Interoperability is a real issue in the healthcare industry. Some blame the problem on legislation — specifically, the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.

The HITECH Act was designed to promote the adoption and use of electronic health records, but the legislation’s overall design pushed interoperability in a limited way, according to Julia Adler-Milstein, an associate professor at the University of Michigan’s School of Public Health.

The U.S. Centers for Medicare and Medicaid Services and the Office of the National Coordinator for Health IT put off defining criteria for the Health Information Exchange until the second stage, Adler-Milstein pointed out.

Further, interoperability is not just a technological issue, she noted. There are also governance and trust issues, business agreements and confidentiality issues.

“Multiple standards, as well as flavors of those standards for medical and health data, have been developed over time … based on the changing needs of the technology products being developed,” Google’s Corkery told the E-Commerce Times.

The Office of the U.S. National Coordinator for Health IT earlier this year released its 2018 Interoperability Standards Advisory.

However, the advisory is for informational purposes only. It is non-binding and does not create or confer any rights or obligations for or on any person or entity.

International standards organizations have stepped in to take up the slack. Health Level 7 International “has taken an active role in the development of standards,” Corkery noted.

HL7v2 is “the dominant data exchange format used in clinical settings today,” he added.

Fast Healthcare Interoperability Resources — managed by the nonprofit FHIR Foundation, which is closely affiliated with HL7 — is an emergent standard, said Corkery.

At the industry level, the Google Healthcare application programming interface is an attempt to get around the interoperability problem.

The API provides “a cloud-based implementation of several widely deployed standards — HL7v2, FHIR and DICOM — to ingest, store, query and analyze data using protocols and formats that clinical systems use today,” Corkery said. “The intent … is not to replace existing standards, but to deliver interoperability between clinical systems and Google cloud services.”
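To make the interoperability gap concrete: HL7v2 messages are terse, pipe-delimited segments, while FHIR represents the same information as structured JSON resources. The sketch below — an illustration of the two formats, not Google’s actual API — converts a bare-bones HL7v2 patient-identification (PID) segment into a FHIR-style Patient resource. The segment contents and the subset of FHIR fields shown are simplified examples, not a complete implementation of either specification.

```python
import json

# A minimal HL7v2 PID segment: fields are pipe-delimited,
# with components inside a field separated by '^'.
hl7_pid = "PID|1||12345^^^Hospital^MR||Doe^Jane||19800101|F"

def pid_to_fhir_patient(segment):
    """Convert a bare-bones HL7v2 PID segment into a FHIR-style
    Patient resource (an illustrative subset of the real spec)."""
    fields = segment.split("|")          # fields[n] corresponds to PID-n
    family, given = fields[5].split("^")[:2]
    dob = fields[7]
    return {
        "resourceType": "Patient",
        "identifier": [{"value": fields[3].split("^")[0]}],
        "name": [{"family": family, "given": [given]}],
        "birthDate": f"{dob[:4]}-{dob[4:6]}-{dob[6:]}",
        "gender": {"F": "female", "M": "male"}.get(fields[8], "unknown"),
    }

print(json.dumps(pid_to_fhir_patient(hl7_pid), indent=2))
```

Even this toy mapping shows why translation layers are valuable: the HL7v2 form is compact but positional, while the FHIR form is self-describing and easier for modern cloud services to query.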

Stay tuned for Part 2.

Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology.

‘Don’t buy electric!’ — car dealerships dismiss electric vehicles, study finds

Car dealerships in Scandinavia — and perhaps beyond — have something against electric vehicles. That’s the conclusion of an investigative study conducted by researchers in Nordic countries, where they visited car dealerships and found agents actively discouraging going electric.

“We essentially found that, contrary to conventional wisdom, most car dealerships do not want to sell electric vehicles, even though they cost more … than ordinary vehicles,” Benjamin Sovacool, a professor of energy policy at the University of Sussex and one of the study’s researchers, told Digital Trends. “This creates a key barrier to adoption that has not yet been addressed by policy, let alone explored systematically in research.”

To conduct their investigation, the researchers visited 82 car dealerships in Denmark, Norway, Sweden, Iceland, and Finland. They arrived under the guise of being everyday customers, and later spoke to experts to give context to their experience.

In two-thirds of their visits, agents steered the researchers toward gas-powered cars, sometimes dismissing electric vehicles (EVs) outright. In over three-fourths of the visits, vendors didn’t even inform the researchers that they sold EVs.

At one dealership, the researchers report being told, “Do not buy this electric car, it will ruin you financially.”

“Dealers act as a key type of agent known as an intermediary — someone between a product and service on the one hand, and the user, customer, or owner on the other hand,” Sovacool said. “They can thus exert very strong influence over what consumers think and do.”

Sovacool notes that the dealer’s role is particularly strong today, given that the relative newness of EVs means consumers don’t know much about them. “We hypothesized that dealers would actually be more supportive of EVs, especially in the Nordic region, and were somewhat shocked by our findings,” he said.

Gerardo Zarazua de Rubens, one of the lead researchers on the study, thinks that attitudes of governments and industry leaders have a tendency to “trickle down” to the selling floor, “and do affect both how car dealerships and salespersons promote and sell vehicles.” This, in turn, appears to discourage consumers from investing in electric options.

“If these countries … are true about using [electric vehicles] as a tool for decarbonizing transportation, there needs to be a systematic revision of their policies and strategies to harmonize them and create a level-playing field, where the EV is a fair option in comparison to a petrol or diesel car,” De Rubens, a business and technology doctoral fellow at Aarhus University, said. “We cannot keep favoring petrol and diesel vehicles and expect that decarbonization … targets will be achieved.”

The report was published this week in the journal Nature Energy.

Micron’s four-level memory tech stacks the storage in its new SSD

Micron is setting an industry first with the release of its 5210 ION solid-state drive. The drive relies on the company’s QLC NAND technology, packing 33 percent more storage capacity than a similar SSD with TLC NAND. The drive targets markets where clunky hard drives still remain the dominant storage device due to their large capacities and lower prices.

QLC is short for quad-level cell, meaning each memory cell can hold four bits of data. It’s a means of providing high storage capacity at a lower cost than a single-level cell (SLC) design: because each QLC cell stores four bits instead of one, the same number of cells yields four times the capacity without a corresponding increase in price.
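The capacity-versus-complexity trade-off behind QLC is simple arithmetic: capacity per cell scales linearly with the bits stored, but the number of distinct charge levels the cell must reliably distinguish grows exponentially. A quick sketch:

```python
# Bits per cell -> capacity scales linearly, but the cell must
# distinguish 2**bits distinct charge levels, which grows exponentially
# (and is why higher-density NAND is harder to read reliably).
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    levels = 2 ** bits              # charge states the cell must resolve
    vs_tlc = (bits / 3 - 1) * 100   # capacity change relative to TLC
    print(f"{name}: {bits} bits/cell, {levels} levels, "
          f"{vs_tlc:+.0f}% capacity vs TLC")
```

The QLC row works out to +33 percent capacity over TLC, matching the figure Micron cites for the 5210 ION, while requiring the cell to resolve 16 charge levels instead of TLC’s 8.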

Typically, the most recent SSDs rely on triple-level cell NAND flash memory. Depending on the manufacturer, the memory is either spread out horizontally like a city block or vertically like a skyscraper. Micron’s “3D” QLC NAND is the latter, enabling more storage without the constraints of the drive’s horizontal physical space by vertically stacking 64 layers of four-level cells.

The company’s new SSD targets datacenters that are in dire need of retiring their hard drive farms. Although SSDs are not fail-proof, they have no moving parts, thus they’re not only power efficient and quiet, but more dependable than hard drives. The only real hook keeping hard drives in the datacenter is their dollar-per-gigabyte ratio, as they’re cheaper than SSDs with the same capacity. Micron’s QLC 3D NAND aims to change that. 

The Micron 5210 ION SSD connects via a standard SATA port just like a hard drive. At the time of this writing, we didn’t have the drive’s read and write speeds, but SATA-based SSDs are always faster than hard drives, though slower than SSDs connected via a PCIe-based M.2 slot. Again, Micron’s SSD aims to replace current hard drives in systems that likely don’t have an M.2 slot. The drives are also only 7mm (0.275 inches) thick, so you can cram more of them into the same space occupied by traditional 3.5-inch hard drives.

Micron first introduced its 5200 family of enterprise-focused SSDs in January. They’re split into four categories: Eco, Max, Pro, and the new Ion drives. The Eco models are capable of around one drive write per day, which means you can rewrite the drive’s full capacity once per day without failure during the warranty period. The Pro models have an endurance of around two drive writes per day while the Max models have an endurance of five writes per day. 

That said, the Eco, Max, and Pro models are based on Micron’s three-level cell NAND memory, while the 5210 ION is the company’s first to use its four-level cell memory. The SSD is shipping to “strategic enablement partners and customers” for now, with broad availability to follow this fall. Capacities will range between 1.92TB and 7.68TB, at prices to be announced closer to launch.

Microsoft acquires conversational A.I. technology firm Semantic

Microsoft is betting big on artificial intelligence. In a blog post published Sunday, May 20, the Redmond, Washington-based technology giant announced the acquisition of Semantic Machines, a company focused on building conversational A.I. “Their work uses the power of machine learning to enable users to discover, access, and interact with information and services in a much more natural way, and with significantly less effort,” Microsoft notes. The move could help give Cortana the leg up it needs on competitors like Amazon Alexa and Google Assistant.

“AI researchers have made great strides in recent years, but we are still at the beginning of teaching computers to understand the full context of human communication,” wrote David Ku, CVP and chief technology officer of Microsoft A.I. and Research. “Most of today’s bots and intelligent assistants respond to simple commands and queries, such as giving a weather report, playing a song or sharing a reminder, but aren’t able to understand meaning or carry on conversations.” But conversational A.I. could turn this norm on its head, and Semantic could be at the forefront of this change.

Semantic has previously worked with major tech firms, leading automatic speech recognition development for Apple’s Siri. In essence, Semantic employs machine learning in order to provide context to chatbot conversations, making dialogue seem a bit more natural and better-flowing.

“With the acquisition of Semantic Machines, we will establish a conversational AI center of excellence in Berkeley to push forward the boundaries of what is possible in language interfaces,” wrote Ku. “Combining Semantic Machines’ technology with Microsoft’s own A.I. advances, we aim to deliver powerful, natural and more productive user experiences that will take conversational computing to a new level. We’re excited to bring the Semantic Machines team and their technology to Microsoft.”

Thus far, no financial details of the acquisition have been disclosed.

Microsoft is by no means the only company trying to make strides when it comes to artificial intelligence and its smart assistants. Amazon, for example, is trying to give Alexa a better memory, while Google is making bots so human-esque that they’re practically indistinguishable from humans during phone conversations with its new Duplex offering. We’ll just have to see how Microsoft keeps up.

If Trump doubles USPS shipping rates for Amazon, will you end up paying?

President Donald Trump personally asked the U.S. Postmaster General Megan Brennan to double shipping rates for Amazon and some other companies — and some financial experts worry Amazon may pass the costs along to consumers.

Postmaster Brennan has refused to comply with the president’s requests, The Washington Post reported this week, saying that the United States Postal Service’s agreements with Amazon and other companies are bound by contracts and only subject to review by the appropriate regulatory agencies. Brennan further stated that Amazon’s relationship with the USPS was a profitable one for the mail carrier.

Despite Brennan’s reassurances, President Trump recently signed an executive order ordering a review of the postal service. It’s possible that this review could result in changes regarding how much Amazon and other companies are charged for package deliveries. And according to a recent analysis by Credit Suisse, the online shopping giant could offset the increased cost of shipping with a $20 hike in annual Prime subscription fees.

Brennan and Trump have discussed the matter of the USPS’s relationship with Amazon several times since the real estate mogul took office in 2017. The most recent meeting occurred about four months ago. In addition to these meetings with Brennan, Trump and three of his key advisers have also discussed this matter. The Post’s sources indicate that Trump’s advisers are split on the matter of Amazon: Some believe Amazon is being subsidized by the postal service, while others argue that the arrangement is beneficial to the USPS.

The details of these meetings haven’t been revealed; we know that they have included Treasury Secretary Steven Mnuchin, former National Economic Council Director Gary Cohn, and Domestic Policy Council Director Andrew Bremberg. Prior to his departure in March, Cohn was one of Amazon’s biggest defenders in the Trump administration.

Despite Amazon’s and the postal service’s assurances that USPS makes money off of Amazon, Trump appears unconvinced.

“I am right about Amazon costing the United States Post Office massive amounts of money for being their Delivery Boy,” the president tweeted on April 3. “Amazon should pay these costs (plus) and not have them bourne by the American Taxpayer. Many billions of dollars. P.O. leaders don’t have a clue (or do they?)!”

Trump’s executive order could result in changes regarding how much USPS charges Amazon. This could, in theory, mean higher prices for Amazon customers — who just saw Amazon raise Prime costs by $20. Is another hike on the horizon?
