This column appears exactly six years to the day after the first one I wrote for the first issue of Upstate Business Journal. As I approached this milestone, it occurred to me how much some technologies have changed in a brief time and how slow others have been to find their footing.
In 2012, cloud computing was anticipated to be the major disrupter of technology. Data centers are dead (just like desktops and Windows XP, right?), pundits wrote. They weren’t wrong about the role cloud computing would play in the changing face of business technology, but they weren’t right either. Large businesses, which were anticipated to be the first adopters, are still largely operating in a hybrid mode.
An IDG survey of 550 large businesses found that 52 percent of the technology environment is noncloud. That surprised me, since cloud computing is cost-effective and reduces management overhead and capital expenditures. But it also requires a massive commitment from people to change and get out of their comfort zones. The vast majority of implementations fail when the humanware will not adapt. Millions, possibly billions, of dollars in tech enhancements sit gathering dust in corporations' storage rooms for that exact reason. People just weren't going to do it.
BYOD and mobility
One thing employees were happy to do is use their own phones, tablets, and laptops at work. So the bring-your-own-device movement — another anticipated disrupter in 2012 — gained a foothold quickly.
BYOD policies are often hybrids with a mix of employee- and company-owned devices in use. Only 20 percent of businesses today are without some BYOD policies, according to an Oxford Research study conducted for Samsung. The BYOD good-or-bad equation seems to teeter here: Employees spend less physical time in the office and more time at home with family, but, in truth, employees never really leave the office. They just take it home with them. Still, the employee ease of adapting to a new paradigm has helped it become standard operating procedure in the vast majority of U.S. businesses.
Big data and artificial intelligence
Nothing has been written about more in the tech and business press over the past six years than big data. In 2012, Inc. magazine listed predictive technology as the top tech innovation to occur in the next year.
It's true that we produce massive amounts of data every minute of every day: All our movements, decisions, purchases, desires, and interests are trackable. And yet only 23 percent of companies, according to IBM, have actually thought about a big-data strategy to harness and use all of that information.
That’s not to say that we haven’t moved in that direction. Many companies have developed learning systems and service bots to be the first line of contact with customers, and, of course, robotics has vastly changed manufacturing and even farming. Thanks to the internet of things, consumers have Siri keeping their calendars on the go, and Alexa and Google Home turning the lights on and ordering pizza.
We are reluctant to jump into self-driving cars or to adapt to robotic assistants, perhaps because we fear losing control, or because we know too well what happens when the computer crashes at the most inopportune time.
And predictive technology? Well, a number of new applications are just now being released. But the holy grail, the big get, remains an amorphous goal, not a tangible next step.
Why? Technology has gotten out over its skis. A core group of visionaries can see the promise of big data, artificial intelligence, and machine learning, but most businesses are still mastering spreadsheets (see the Harvard Business Review’s October 2018 issue with 10 key Excel functions).
Is it a good trend or not? Technology has boomed, certainly, in the last decade. But its growth has been constrained by one thing: It’s come down to people, people.