Overnight success now requires a little more time

Ten years ago the iOS App Store launched — and the mobile revolution was off. Entrepreneurs everywhere rallied to take advantage, building category-defining consumer companies like Twitter, Uber, Lyft and Square, among many others.

There’s no better time for an entrepreneur to start a company than when a new platform like mobile emerges. The rising tide in these moments becomes a tsunami: Eager customers descend on services through word of mouth and new acquisition channels; there’s outsized press interest; and sales take off in part due to growth of the platform itself.

Now is not one of these periods. Mobile appears mature, and the next great enabling platform is still beyond the horizon. That’s why many early-stage VCs have shifted their focus away from consumer and toward other new enabling technologies, such as autonomous vehicles, blockchain and AI/ML.

I have a different view. I think now is a great time to build consumer companies, even without a new platform. There are three reasons for this. First, the internet has created big problems for humans, organizations and society, which entrepreneurs can attack at scale. Second, the first wave of mobile-enabled companies has laid a foundation — including processes, seasoned executives and business models — that new entrepreneurs can borrow. And third, mobile technology is still changing and evolving.

Let’s take a closer look at all three.

Solving big problems

The last wave of breakout companies created interactive platforms (Twitter, Snapchat, Instagram, etc.) that have entertained many. They didn’t solve big societal problems. There’s now a big need — and big opportunity — for companies that can help people save time, money and sanity, even as they build great businesses.

Most of us now realize the major problems that a connected, mobile, always-on world has wrought. These include:

  • Income inequality. Lower-income Americans are struggling more than ever. Entrepreneurs should be thinking of ways to help folks where they need it the most: the pocketbook. That might mean unlocking found money, ensuring that available financial resources are being used wisely or saving consumers from the growing number of “gotchas” imposed by financial institutions.
  • Too many choices. When you can buy or choose anything, it’s hard to pick what you actually want. There are wide-open opportunities for concierges, curation and trusted guides.
  • A lack of intimacy. With everything online and available at the touch of a keypad, genuine human interaction has become more rare. There’s a need for companies that can provide real care and curation for matters that affect our daily lives.

Newly available resources

After a decade of building companies for mobile, there are now untold stories, battle scars and people available for future companies to learn from. This makes it easier for startups to assemble playbooks and experienced teams. It also reduces the downside risk for investors, opening new paths to capital for companies that need it.

For instance, it’s now clear that consumer brands must define, own and curate an end-to-end experience. A great new example is GOAT, the online sneakerhead marketplace. Faced with a sneaker market full of rampant knock-offs, the founders invested in a capital- and time-intensive process to manually inspect every shoe for authenticity. The result is an experience that every sneakerhead loves and a breakthrough consumer brand.

There are also lots of executives and teams that know how to lead and manage complex operations, especially on the ground. This is crucial to scale logistically complex ideas like Opendoor, Instacart and others.

The other thing needed to help scale these companies is capital. And right now, there are two particularly relevant new kinds of investors: 1) mega equity funds like SoftBank Vision Fund, and 2) alternative lending funds that provide non-dilutive capital to companies to finance the acquisition of traditional assets. Those capital sources enable companies like Opendoor (disclosure: I’m a personal investor) to own and manage a truly delightful end-to-end experience.

Mobile today is not mobile tomorrow

Mobile devices have come a long way over the last decade. And there will be many more meaningful improvements in the near future, allowing for new uses and new companies.

I anticipate breakthroughs that will boost the ability of the chips and subsystems on a phone to perform optimally for far longer. Right now, these are throttled due to heat and other constraints. As companies solve these issues, they’ll deliver order-of-magnitude improvements in what our phones are capable of, bringing technologies like VR and AR, to take two examples, into everyday use.

On the network side, 5G and subsequent buildouts will meaningfully change what kinds of bandwidth we can handle, enabling even more data and compute to be in the cloud.

Mobile today is about one-to-many broadcast platforms like Instagram, Twitter and Facebook. Tomorrow’s great consumer companies will leverage a better vector: one-to-one customer intimacy. Companies like Grove Collaborative (disclosure: Mayfield is an investor) are experiencing hypergrowth in part by using real people connecting with consumers over text to bring a curated, personalized experience to shopping for household staples. I expect this to be a major trend, with the winners being the companies that earn the right to communicate more with their customers.

Building a breakout consumer platform will be more complex, more challenging and often more capital-intensive than it was for certain titans of the prior generation. But for those with the vision and substance to bring a valuable service to the world that solves real problems, the resources and emerging technologies will be there to help create the next groundbreaking consumer brand.

Chatbots Gaining Altitude

How do they really know? That’s the question that immediately comes to mind in reviewing the top-level data from Voxpro’s recent survey of customers and their relationship with chatbots. The data show that 68 percent of consumers haven’t used chatbots to contact a brand. About 1,000 people answered the survey.

How reliable is that number, though? I’m not disrespecting Voxpro here — just the opposite. Isn’t it at least possible that some consumers have interacted with bots without realizing it? Of course it is, and the true number of people who haven’t used a chatbot therefore might be smaller than reported.

One Direction

Consider another data point from Voxpro’s study: Fifty-six percent of survey respondents said they hadn’t used chat or other automated features because they prefer dealing with real people. That leaves 12 percent not using because… I don’t know.
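That 12 percent is simply the gap between the two reported figures. A quick sketch of the arithmetic, using the survey numbers as reported above (no other data assumed):

```python
# Figures as reported in the Voxpro survey discussed above.
havent_used = 68    # % of respondents who say they haven't used a chatbot
prefer_people = 56  # % who avoid automation because they prefer real people

# The remainder are non-users for some other, unstated reason.
other_reasons = havent_used - prefer_people
print(f"Non-users citing other reasons: {other_reasons}%")
```

This assumes the 56 percent is a subset of the 68 percent, which is how the survey framing reads but is not stated outright.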

Whatever the true measure is, within this narrow range, I’d say we’re right on schedule. It’s early days in the automated assistant game, and the numbers so far only confirm this.

We’re still working to find the right circumstances for the technology — another indicator that it’s early days. Even so, the objective remains to find better ways to support the customer experience, though I’d prefer to say we’re trying to promote customer engagement.

“Engagement” may sound foreign to many ears when we’re discussing machines interacting with customers — until you realize that engagement is meant to be a two-way street, and the only direction that matters runs from customer to brand.

The purpose of chatbots, let us never forget, is to instill a feeling of satisfaction and even confidence in customers that will make them not only want to return but also tell others about their experience.

End of Personal Info Sharing?

Other noteworthy data points include stratification by age cohort: 32 percent of survey respondents overall reported using chatbots, but only 22 percent of those 65 and over had done so. About half were concerned about retaining the ability to talk with a person if they couldn’t get satisfaction through automation. There’s nothing wrong with that.

Also, from the “stuck between a rock and a hard place” department, 45 percent of people 55 years old or older would prefer that a brand have their information before an interaction, presumably to save the time and effort associated with giving it. However, the same number — 45 percent — didn’t want a brand to store their information because they were concerned about data breaches.

Finally, it appears from the survey that 18- to 24-year-olds (32 percent to 45 percent) trusted brands with their data even less than older people. It used to be that younger people — we called them “digital natives” — were OK with sharing their info, but this study indicates that era might be over.

Finding Meaning

It might be best not to draw too much meaning from this study. It’s a snapshot in a rapidly changing market, after all, and the data undoubtedly will reveal different things next time.

Still, smart vendors might pull some knowledge out of it nonetheless. For instance, if people really don’t know if they’ve interacted with chatbots already, that’s terrific. It means the technology has reached the point of being good enough, and its utility likely will grow.

It also could mean that at least in some situations, chatbots are a really good fit. However, it also means that we still have work to do gathering customer data from other sources to build composites of customers (with artificial intelligence) that can work in a pinch.

All told, this is an interesting view into the workings of one of CRM’s newest branches, and it tells us that the marketplace is operating just as it should.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

Denis Pombriant is a well-known CRM industry analyst, strategist, writer and speaker. His new book, You Can’t Buy Customer Loyalty, But You Can Earn It, is now available on Amazon. His 2015 book, Solve for the Customer, is also available there.

When the News Went Live… Online

The shift of live content from traditional broadcast television to online sources has significantly changed the video industry — and especially, the news media industry. News typically derives its value and attracts viewers with live updates and real-time engagement.

The rise of live online alternatives extends those capabilities to nontraditional voices in the digital realm, including bloggers, social media influencers, independent video creators, digital networks, and over-the-top news services.

The combination of digital creators and more professional digital networks, like Cheddar, offers consumers more news options than ever, and consumers easily can choose the type of news media they want to watch.

Diverse Personalities, Homogeneous Coverage

They also have the technology in hand to do so, as U.S. broadband households now have more than eight connected consumer electronic devices, on average, and WiFi coverage is increasingly ubiquitous, allowing high-speed wireless connections almost anywhere.

These factors have created a more crowded news industry, where traditional and alternative news media compete for attention and viewers across a variety of platforms. While this means a diversity in news personalities, it often leads to a homogeneity in the news stories being covered, as all players scramble to cover the top breaking stories in order to maximize their audience.

A decline in local broadcast news viewership has further eroded diversity of coverage. Consumers have been replacing local news with news from social media and national outlets.

Recent surveys have confirmed the overall growth of social media news, growth among viewers 50 and older, and social media’s place among young viewers. Sixty-seven percent of U.S. adults reported getting some of their news from social media in 2017, up from 62 percent in 2016, Parks Associates reported.

Chart: Live TV Broadcast Sources on a TV Set by Age

The national media giants rely on commentary from experts and media personalities to supplement breaking news, along with debate and talk shows that discuss news-related topics, in order to hold their audience.

Even Netflix has tried to enter the talk-show space, with limited success so far, by launching shows featuring David Letterman, Chelsea Handler and Michelle Wolf, among others.

The Video Gamble

These trends have left local broadcast news outlets in a tough spot, as they do not have competitive advantages in either breaking news or news analysis. This likely will drive increased consolidation, digitization, or reduction in local broadcast news outlets.

Even the national outlets will lean more heavily on online sources for their news coverage. Many of these companies, like Fox News and CNN, will simulcast coverage from television, while creating live shows specifically for online viewing.

Additionally, social media Q&As about particular topics, articles, and events have become commonplace within online news. These can occur with writers themselves, as is the case with Politico’s coverage of the midterm elections, or in the case of news aggregators like Yahoo! and Google, which have a limited staff of writers or no writers at all, but host live streams of other news stations instead.

With news shifting to digital in such a pervasive way, many changes have occurred with regard to the way news organizations drive viewers to their content. Specifically, news organizations have been using video as a larger part, or the only part, of their content mix.

Starting in 2017, a number of news organizations, such as MTV News and Vocativ, pivoted to video: They abandoned long-form articles, laid off most of their writing and editing staff, and pursued a solely video-focused strategy.

While there are advantages to the pivot to video in terms of viewer engagement and session length, this strategy presents a number of challenges. News media companies employing this video strategy typically see significantly fewer visitors to their website, but they score higher engagement and longer sessions with those fans, who are more passionate than the average viewers.

That is an important payoff, because good video content is extremely difficult and resource-intensive to make. News media must adapt their video content to a variety of different platforms.

Finally, social media sites are not beholden to third-party content creators and can opt to change the algorithm or interface at any time, in the hope of improving the user experience. Facebook sparked controversy recently, related to alleged misrepresentations of its video content metrics.

New Opportunities

Media organizations, which build their digital strategies based on these algorithms, can see their traffic from social media drop when these changes occur, as happened recently to BuzzFeed, HuffPost and Mashable.

News organizations will address these challenges, as the need for video and online live options is a key part of the new strategy for news and content organizations. The decline in viewing on linear television should not be mistaken for dwindling consumer interest in live news coverage.

Live video in a variety of content categories — sports, news and events — has tremendous appeal to all viewers and continues to play a critical role in the content business. Consumers seem more than willing to watch ads during online news coverage as opposed to scripted television, indicating there are new business opportunities in this area.

The shift of live content to online, enabled by the proliferation of connected devices, creates new and appealing content options for both providers and consumers. Alternative platforms have stepped in to capture a portion of the live audience, and services like Instagram Live and Twitter have led the charge in live online video entertainment.

Major national news organizations, accustomed to viewers turning to them, now need to go where the consumers are, which increasingly is online and on social media platforms.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

Billy Nayden is a research analyst at Parks Associates.

Getting Clarity on the Private vs. Public Cloud Decision

This story was originally published on June 13, 2018, and is brought to you today as part of our Best of ECT News series.

News flash: Private cloud economics can offer more cost efficiency than public cloud pricing structures.

Private (or on-premises) cloud solutions can be more cost-effective than public cloud options, according to “Busting the Myths of Private Cloud Economics,” a report 451 Research and Canonical released Wednesday. That conclusion counters the notion that public cloud platforms traditionally are more cost-efficient than private infrastructures.

Half of the enterprise IT decision-makers who participated in the study identified cost as the No. 1 pain point associated with the public cloud. Forty percent mentioned cost-savings as a key driver of cloud migration.

“We understand that people are looking for more cost-effective infrastructure. This was not necessarily news to us,” said Mark Baker, program director at Canonical.

“It was interesting to see the report point out that operating on-premises infrastructure can be as cost-effective as using public cloud services if done in the right way,” he told LinuxInsider.

Scope of the Report

The Cloud Price Index, 451 Group’s tracking of public and private cloud pricing since 2015, supplied the data underpinning the latest report. Companies tracked in the Cloud Price Index include but are not limited to Amazon Web Services, Google, Microsoft, VMware, Rackspace, IBM, Oracle, HPE, NTT and CenturyLink.

The Cloud Price Index is based on quarterly surveys of some 50 providers across the globe that together represent nearly 90 percent of global Infrastructure as a Service revenue, noted Owen Rogers, director of the Digital Economics Unit at 451 Research.

“Most providers give us data in return for complimentary research. Canonical asked us if they could participate as well. Any provider is welcome to submit a quotation and to be eligible for this research,” he told LinuxInsider.

Providers are not compared directly with each other, because each vendor and each enterprise scenario is different. It is not fair to say Provider A is cheaper than Provider B in all circumstances, Rogers explained.

“We just provide benchmarks and pricing distributions for a specific use-case so that enterprises can evaluate if the price they are paying is proportional to the value they are getting from that specific vendor,” he said. “Because we keep individual providers’ pricing confidential, we get more accurate and independent data.”

Trend Toward Private

The private cloud sector continues to attract enterprise customers looking for a combination of price economy and cloud productivity. That combination is a driving point for Canonical’s cloud service, said Baker.

“We see customers wanting to be able to continue running workloads on-premises as well as on public cloud and wanting to get that public cloud economics within a private cloud. We have been very focused on helping them do that,” he said.

Enterprise customers have multiple reasons for choosing on-premises or public cloud services, ranging from workload characteristics and highly variable workloads to different business types, such as retail operations. Public clouds let users vary their capacity.

“You see the rates of innovation delivered by the public cloud because of the new services they are launching,” said Baker, “but there is a need for some to run workloads on-premises as well. That can be for compliance reasons, security reasons, or cases where systems are already in place.”

In some cases, maintaining cloud operations on-premises can be even more cost-effective than running in the public cloud, he pointed out. Cost is only one element, albeit a very important one.

Report Takeaway

The public cloud is not always the bargain buyers expect, the report suggests. Cloud computing may not deliver the promised huge cost savings for some enterprises.

Reducing costs was the enterprise’s main reason for moving to the cloud, based on a study conducted last summer. More than half of the decision-makers polled said cost factors were still their top pain point in a follow-up study a few months later.

Once companies start consuming cloud services, they realize the value that on-demand access to IT resources brings in terms of quicker time to market, easier product development, and the ability to scale to meet unexpected opportunities.

As a result, enterprises consume more and more cloud services as they look to grow revenue and increase productivity. With scale, public cloud costs can mount rapidly, without savings from economies of scale being passed on, the latest report concludes.

Factoring In Cloud Costs

Enterprises using private or on-premises clouds need the right combination of tools and partnerships. Cost efficiency is only possible when operating in a “Goldilocks zone” of high utilization and high labor efficiency.
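The “Goldilocks zone” idea can be made concrete with a toy cost model. All figures below are hypothetical illustrations, not numbers from the 451 Research report: fixed private-cloud costs (amortized hardware plus operations labor) get spread over the VM-hours actually consumed, so low utilization inflates the effective per-unit price.

```python
def private_cost_per_vm_hour(capex_per_hour, labor_per_hour,
                             vm_capacity, utilization):
    """Effective cost of one VM-hour on a private cloud.

    Fixed hourly costs are divided by the VM-hours actually used,
    so the per-unit price falls as utilization rises.
    """
    used_vm_hours = vm_capacity * utilization
    return (capex_per_hour + labor_per_hour) / used_vm_hours

public_rate = 0.10  # hypothetical public-cloud price per VM-hour

for utilization in (0.3, 0.6, 0.9):
    rate = private_cost_per_vm_hour(
        capex_per_hour=40.0,  # hypothetical amortized hardware cost per hour
        labor_per_hour=20.0,  # hypothetical ops labor cost per hour
        vm_capacity=1000,     # VMs the cluster can host
        utilization=utilization,
    )
    verdict = "cheaper" if rate < public_rate else "pricier"
    print(f"utilization {utilization:.0%}: ${rate:.3f}/VM-hour ({verdict} than public)")
```

Under these made-up numbers, the private cloud undercuts the public rate only at high utilization, which is the Goldilocks-zone argument in miniature; the same logic applies to labor efficiency.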

Enterprises should use tools, outsourced services and partnerships to optimize their private cloud as much as possible to save money, 451 recommended. That will enhance their ability to profit from value-added private cloud benefits.

Many managed private clouds were priced reasonably compared to public cloud services, the report found, providing enterprises with the best of both worlds — private cloud peace of mind, control and security, yet at a friendlier price.

Managed services can increase labor efficiency by providing access to qualified, experienced engineers. They also can reduce some operational burdens with the outsourcing and automation of day-to-day operations, the report notes.

Convincing Conclusions

While public cloud services can be valuable in many circumstances, they are not necessarily the Utopian IT platform of the future that proponents make them out to be, observed Charles King, principal analyst at Pund-IT.

“As the report suggests, these points are clearly the case where enterprises are involved. However, they are increasingly relevant for many smaller companies, especially those that rely heavily on IT-based service models,” he told LinuxInsider.

An interesting point about the popularity of private cloud services is that their success relates to generational shifts in IT management processes and practices, King noted. Younger admins and other personnel gravitate toward services that offer simplified tools and intuitive graphical user interfaces that are commonplace in public cloud platforms but unusual in enterprise systems.

“Public cloud players deserve kudos for seeing and responding to those issues,” King said. “However, the increasing success of private cloud solutions is due in large part to system vendors adapting to those same generational changes.”

Canonical’s Place

Canonical’s managed private cloud compares favorably to public cloud services, the report found. Canonical last year engaged with 451 Research for the Cloud Price Index, which compared its pricing and services against the industry at large using the CPI’s benchmark averages and market distributions.

Canonical’s managed private cloud was cheaper than 25 of the public cloud providers included in the CPI price distributions, which proves that the benefits of outsourced management and private cloud do not have to come at a premium, according to the report’s authors.

High levels of automation drive down management costs significantly. Canonical is a pioneer in model-driven operations that reduce the amount of fragmentation and customization required for diverse OpenStack architectures and deployments.

That likely is a contributing factor to the report’s finding that Canonical was priced competitively against other hosted private cloud providers. Canonical’s offering is a full-featured open cloud with a wide range of reference architectures and the ability to address the entire range of workload needs at a competitive price.

More Sophisticated Understanding

It is not so much a divide between private and public cloud usage in enterprise markets today, suggested Pund-IT’s King, as a case of organizations developing a clearer understanding or sophistication about what works best in various cloud scenarios and what does not.

“The Canonical study clarifies how the financial issues driving initial public cloud adoption can and do change over time and often favor returning to privately owned cloud-style IT deployments,” he explained. “But other factors, including privacy and security concerns, also affect which data and workloads companies will entrust to public clouds.”

A valid case exists for using both public and private infrastructure, according to the 451 Research report. Multicloud options are the endgame for most organizations today. This approach avoids vendor lock-in and enables enterprises to leverage the best attributes of each platform, but the economics have to be realistic.

It is worth considering private cloud as an option rather than assuming that public cloud is the only viable route, the report concludes. The economics showcased in the report suggest that a private cloud strategy could be a better solution.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open source technologies. He has written numerous reviews of Linux distros and other open source software.

When Is the Time to Hire a Cyber Specialist?

Cybersecurity has become a larger and larger concern for organizations. Nowadays, most organizations — regardless of size, industry, location, or profit vs. nonprofit status — find themselves directly or indirectly affected by cybersecurity issues.

Even as the topic grows in importance, many smaller organizations (and in fact, some mid-sized ones) still don’t have specialized security expertise on staff.

That isn’t to say that there’s nobody working on security-relevant tasks in those organizations. They may have personnel who perform security tasks along with their other responsibilities, or they may have outsourced aspects of security to external service providers. However, even though aspects of cybersecurity are being handled in those organizations, they’re being handled without a single, named, accountable individual overseeing the function.

This can be problematic as an organization grows. It can lead to uncomfortable discussions with clients, for example. It can result in potential audit findings, or put organizations out of compliance with regulatory mandates in some situations, or have numerous other undesired consequences.

For those organizations the question then becomes this: When is the right time to assign someone to security full time, or to shift responsibilities so that oversight falls on a single accountable individual?

Is it when the organization reaches a certain size threshold (e.g., when it gets to 100 workers)? Is it when the organization reaches a certain volume of revenue? The answer, it turns out, is more complicated than any hard and fast rule. That said, there are a few factors to consider that can directly inform the decision as to when is the right time to assign a resource full time.

Why Designate Someone?

To best understand when that time is, it’s helpful to assess the value provided by having an assigned staff member in the first place. It’s advantageous across several dimensions.

First, having a single individual responsible for cybersecurity establishes accountability. When responsibility is distributed among multiple individuals — or when responsibility is otherwise unclear — important security-relevant tasks can slip through the cracks. Designating someone, clearly and unambiguously, helps control this.

Second, it helps defuse conflicts of interest. Sometimes appropriate security due diligence means pushing back on otherwise-valuable activities. When an individual’s job includes both security and something else in equal measure, situations can arise when that person will need to choose one role over the other.

Consider, for example, a situation in which someone is responsible for both security and deploying business applications. What happens when, perhaps because of a software flaw or some other reason, fielding an application into production potentially puts the organization at risk?

In that case, the individual with those combined responsibilities would have to decide whether to release the application (because of the application deployment function) or to push back on it (because of the security function). Making the security function independent and focused would help prevent such situations from arising.

How Will You Know?

The point is that there’s clear value in assigning security specifically to someone. Still, as a practical matter, the size of the organization can make doing so a nonstarter, despite the benefits. For example, an organization with one employee obviously wouldn’t be able to allocate its sole employee to a full-time security role. If it did, it probably wouldn’t stay in business very long.

On the other hand, it would be ludicrous to imagine a large, multinational bank without someone assigned to security. But when is that transition appropriate? It’s not always clear-cut.

That said, there are situations that can make the decision easier — for example, when there is a regulatory, legal or contractual requirement to assign someone. HIPAA, for example, specifically requires that organizations designate a named security officer.

Likewise, the PCI DSS contains language about assignment of security duties. While in both cases the regulation doesn’t specifically state that these individuals do only security and nothing else, the fact that the regulation contains this language can help reduce ambiguity.

Beyond regulatory requirements, though, customer expectations can help drive the decision. If you’re an organization that services security-conscious clients, for example, having an accountable individual assigned to security can help address customer expectations, provide a central point of contact for customer security-related questions, and otherwise streamline the sales and service delivery process.

Ultimately, the decision as to when to hire specialized staff will vary, based on a number of organization-specific factors. That said, one useful measure to consider in evaluating this decision is as a function of two factors: staff time and organizational risk.

From a time-utilization standpoint, a useful time to consider allocating specialized staff comes when organizations reach the point that employees must defer urgent or high-priority security tasks because of other commitments or deadlines. In other words, if you’re postponing something that is important to keeping your organization protected because of other things on staff members’ plates, that should be a warning sign that it might be time to shift responsibilities.
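One way to make that warning sign operational is a back-of-the-envelope check. The threshold, the field names and the heuristic itself below are all illustrative assumptions of mine, not guidance from any regulation or standard:

```python
def time_to_hire_security_lead(deferred_hours_per_week,
                               total_security_hours_per_week,
                               regulated=False):
    """Toy heuristic: does it look like time for a dedicated security hire?

    regulated: True if a mandate such as HIPAA requires a named
    security officer, which settles the question on its own.
    The 25% threshold is an arbitrary illustrative cutoff.
    """
    if regulated:
        return True
    if total_security_hours_per_week == 0:
        # No one is tracking security work at all -- a warning sign in itself.
        return True
    deferred_share = deferred_hours_per_week / total_security_hours_per_week
    # If a large share of known security work is being postponed,
    # responsibilities should shift to an accountable individual.
    return deferred_share > 0.25

print(time_to_hire_security_lead(2, 20))                  # -> False
print(time_to_hire_security_lead(8, 20))                  # -> True
print(time_to_hire_security_lead(0, 5, regulated=True))   # -> True
```

A real decision would weigh organizational risk alongside staff time, as the preceding paragraph notes; this sketch captures only the time-utilization half.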

This, of course, implies that you know what security-relevant tasks exist in the first place. If you don’t, this is also a potential warning sign. You might consider a short-term exercise of assessing your organization’s security pain points — either by making time for existing staff to evaluate it, if they have the skills, or working with a trusted advisor to help you find out how many tasks are being overlooked, and the potential impact as a result.

Either way, bear in mind that hiring cybersecurity specialists can be more difficult than hiring for other technology-forward positions. It can be time-consuming to find the right fit, and it sometimes can take six months or more to find the right blend of skills in the right areas.

This means that, ideally, you’ll begin the search process a few months ahead of when you actually need that resource. This is helpful to keep in mind so that you don’t get caught out when the time to fill that position becomes urgent.

Ed Moyle is general manager and chief content officer at Prelude Institute. He has been an ECT News Network columnist since 2007. His extensive background in computer security includes experience in forensics, application penetration testing, information security audit and secure solutions development. Ed is co-author of Cryptographic Libraries for Developers and a frequent contributor to the information security industry as author, public speaker and analyst.