Last week my good friends Brent Leary and Paul Greenberg opened up their online show, CRM Playaz, to an executive roundtable discussion with some of the movers and shakers in our CRM world.
On hand to discuss the future of CRM were:
Yours truly was also there to lend my thoughts once the CRM executives had offered theirs.
I know almost all of them and they are friends. They acquitted themselves well in outlining their views of where CRM is going; and since they are at the top of the food chain in their respective companies, there is a close correlation between their views and the direction of their CRM programs. As it should be.
With all that as preface I looked askance at the discussion. Unfailingly, each speaker espoused the centrality of the customer and visibility through 360 degrees of customer interaction. It’s hard to disagree with that — though I will.
The customer has always been of central importance to any entity trying to sell something. Just for grins and giggles, I can trace customer centrality all the way back to the ancient Greek historian, Herodotus who described how merchants from Asia Minor would beach their boats and sell their wares to the locals. When you beach your boat, you have to hang around for at least twelve hours before you can take off and Herodotus describes some big boats and the need to beach them for several days to sell out.
There’s nothing like being stranded in a foreign land to make a merchant want to ensure complete customer satisfaction. So, customer 360, or whatever you want to call it, has a deep history that predates anything we’ve done with software in the last 25 years — or whatever we might think up in the next.
Still, the idea of seeing the customer in full stays with us for a very good reason. Customer orientation has to be learned by each generation of people who wish to work in customer-facing industries. Even in Herodotus’ time, merchants were far from perfect. He reports that when the boats were nearly empty of cargo and the tide was rising, it was a perfect time for stealing the women, who made up most of the customers, and sailing off.
Unwittingly, Herodotus also documents one of the earliest uses of the logical fallacy, the non sequitur, when he quotes the merchants saying, “We didn’t steal your women, they wanted to come with us.” He also uses this common case as an example of how the Trojan War started, when Paris, a prince from Asia Minor, abducted Helen, who was married to Menelaus, king of Sparta, a warrior-led Greek city-state. It didn’t end well for Paris or Troy.
But that’s literally ancient history.
Last week’s festival du customer, I think, exposes multiple deficiencies in our approach to CRM. First, it diminishes differentiation at a time when vendors should want to emphasize it. If everyone is all about customers, then how do you choose among them? It’s reminiscent of the political season when everyone is for apple pie, hot dogs, and motherhood. Besides, what else is there except the customer?
Second, if customer centricity has been with us for millennia, then touting it is not very forward looking, though it supports the idea of generational adoption. Third, despite all of the technology available and the long history of CRM, user organizations still don’t get it.
My research, which I’ve quoted here before, shows that too many organizations still don’t employ the tools needed to support grandiose ambitions of customer centricity, especially when it comes to having and using AI and machine learning.
For this we need to look elsewhere, away from the CRM vendors who are churning out good, new functionality. We need to take a hard look at the companies themselves, and their cultures.
What I see when I look at the CRM-consuming companies is a minority that use it strategically. The rest have a more or less consistent view that says we’ll take a little of this CRM stuff, but not too much lest we have to make serious changes.
These companies are too often afraid of their sales reps and consequently demand minimum compliance with entering data or sticking to a fact-based sales strategy.
Those companies are inconsistent as well. While they demand too little of sales, they demand rigorous accounting of marketing spend and require scrupulous record keeping in service settings even when that means updating multiple systems because their customer-facing systems are not well integrated.
There’s a strong trend in human nature to go with your gut, to make snap decisions and execute, rather than being more contemplative which might provide a better result.
Not long ago the psychologist Daniel Kahneman won the Nobel Prize in Economics for research that underpins behavioral economics and, indirectly, CRM. To be overly succinct, it turns out that people really do make rational decisions for emotional reasons.
Somewhere in the hierarchy of each business, someone needs to take responsibility for ensuring that we make rational decisions in our customer-facing processes. This means not only adopting CRM and training our people in its use, but then insisting on it.
I think that’s where the leading edge is today. With all of the good technology available right now, the emphasis should be on training in CRM techniques, especially trusting the data. That’s a long way from simply teaching people where the enter key is, because we live in a vastly changed business environment.
We call it the digital disruption and that’s alliterative and catchy. But maybe we should be focusing on people-powered outreach.
Apple refreshed its iPhone product line Tuesday, introducing four new models, as well as a downsized version of its HomePod smart speaker.
All of the new iPhones are built around Apple’s blazing fast A14 Bionic chip, support all flavors of 5G wireless connectivity, and are fronted by a Ceramic Shield cover that Apple says protects the phone from drop damage four times better than previous models.
“The Ceramic Shield may sound trivial to some people, but when you can improve the drop test performance of a smartphone by 4x, that’s not a trivial achievement,” Mark N. Vena, a senior analyst at Moor Insights & Strategy, told TechNewsWorld.
The big news from the event, though, was Apple finally making iPhones that support 5G.
“This marks the start of a new era for iPhone,” Apple CEO Tim Cook said at a virtual online event announcing the new products.
Bringing 5G to the iPhone is a huge moment for the product, he noted. “5G will bring a new level of performance for downloads and uploads, higher quality video streaming, more responsive gaming, real time interactivity and so much more,” Cook proclaimed.
“5G networks are more advanced with lower latency and less network congestion so you can get higher network speeds even in densely populated areas,” he continued.
“And 5G even helps protect your privacy and security,” Cook explained, “since you won’t need to connect to unknown, unsecured public Wi-Fi hot spots as often.”
Apple is coming to the 5G table after many of its rivals have been eating there for some time. “Apple couldn’t hold off on 5G much longer, despite the varying degrees of 5G availability and performance,” said Gerrit Schneemann, a principal analyst at Omdia, a research and consulting firm based in London.
“Some markets, like the U.S. and China, are heavy into the 5G cycle now,” he told TechNewsWorld. “China is an important market for Apple, so not having 5G handsets is probably not the best position for them to be in.”
All the new iPhone models support both flavors of 5G: sub-6 GHz and mmWave. “They’re future-proofing their lineup,” Schneemann said. “They’re setting up those devices to be in the market for quite some time and support 5G in the market whenever and wherever it becomes available.”
The four new models introduced by Apple are:
The 12 and mini, which will be offered in white, black, blue, green and (Product) Red, can be pre-ordered on Oct. 16, for shipment Oct. 23. The Pro and Max, which are offered in graphite, silver, gold and Pacific blue, can be pre-ordered Nov. 6, for shipment on Nov. 13.
The iPhone mini is likely to get a lot of consumer attention. “There’s been tremendous pent-up demand for a high-quality iPhone in a smaller form factor,” said Tim Bajarin, president of Creative Strategies, a technology advisory firm in Campbell, Calif.
“And then when you add 5G and a really good camera imaging system for $699, the mini will be a big hit,” he told TechNewsWorld.
All the models have 12-megapixel front and back camera systems. The 12 and mini have dual systems on the back with Ultra Wide (f2.4) and Wide (f1.6) lenses. The Pro and Max have a three-lens system on the back, with ultra-wide, wide and telephoto (f2.2) lenses.
The Pro and Max also have features appealing to professional photographers, such as Apple ProRAW. RAW is a photo format used by professional photographers. ProRAW integrates RAW photography with computational photography to give a photographer greater control of the images they capture.
“Apple is really going after the professional camera market,” Moor’s Vena said. “The things you can do with computational photography are rivaling what you can do with the best DSLR cameras out there.”
“Apple isn’t content making a phone with a really good camera,” he added. “They want to make it your default camera.”
All the phones also support Dolby Vision HDR video. “That’s a really high-resolution professional format,” Vena explained. “Shooting in 4K HDR and Dolby Vision, you will get the same kind of look and feel of a professional movie.”
The new Pro and Max iPhones also have LiDAR scanners, which measure the time it takes for light to travel to an object and back. With that information, the iPhone can understand its surroundings and build a depth map of them. The technology is a tip-off to Apple’s future plans for the iPhone.
“It’s really important in augmented reality,” explained Bajarin, of Creative Strategies. “In augmented reality apps it creates realistic images and information that sits on top of those images.”
“By bringing LiDAR into the higher end iPhone models, it’s a hint that they’re getting closer to expanding what they’re doing in AR,” he added.
LiDAR can also be used to improve Night mode portraits and focusing faster in low light conditions.
In addition to the new iPhones, Apple introduced a smaller version of its HomePod smart speaker.
Like its big brother, the HomePod mini ($99) is wrapped in a mesh fabric created for its acoustic properties. On top of the unit is a backlit touch surface with controls for volume, play and pause, which illuminates when Siri, Apple’s digital assistant, is summoned.
Through the mini, you can play music, get answers to questions, control home devices with your voice and interact with other Apple devices. With multiple minis, you can seamlessly pipe music into multiple rooms and even use them as an intercom system.
“Google and Amazon make small speakers, but they don’t have the quality that I’ve heard in the HomePod mini,” Bajarin maintained.
“I think Apple is being aggressive with the pricing of it, given the quality of the device,” he said.
“Then you add the fact that it connects to every device you use within your Apple ecosystem,” he added. “At $99, it’s just a great buy.”
It probably shouldn’t, but it routinely astonishes me how much we live on the Web. Even I find myself going entire boot sessions without using anything but the Web browser. With such an emphasis on Web-based services, one can forget to appreciate the humble operating system.
That said, we neglect our OS at the risk of radically underutilizing the incredible tools that it enables our device to be.
Most of us only come into contact with one, or possibly both, of two families of operating systems: “House Windows” and “House Practically Everything Else.” The latter is more commonly known as Unix.
Windows has made great strides in usability and security, but to me it can never come close to Unix and its progeny. Though more than 50 years old, Unix has a simplicity, elegance, and versatility that is unrivalled in any other breed of OS.
This column is my exegesis of the Unix elements I personally find most significant. Doctors of computer science will concede the immense difficulty of encapsulating just what makes Unix special. So I, as decidedly less learned, will certainly not be able to come close. My hope, though, is that expressing my admiration for Unix might spark your own.
If you haven’t heard of Unix, that’s only because its descendants don’t all resemble it equally — and they definitely don’t share a name. MacOS is a distant offshoot which, while arguably the least like its forebears, still embodies enough rudimentary Unix traits to trace a clear lineage.
The three main branches of BSD, notably FreeBSD, have hewed closest to the Unix formula, and continue to form the backbone of some of the world’s most important computing systems. A good chunk of the world’s servers, computerized military hardware, and PlayStation consoles are all some type of BSD under the hood.
Finally, there’s Linux. While it hasn’t preserved its Unix heritage as purely as BSD, Linux is the most prolific and visible Unix torchbearer. A plurality, if not outright majority, of the world’s servers are Linux. On top of that, almost all embedded devices run Linux, including Android mobile devices.
To give as condensed a history lesson as possible, Unix was created by an assemblage of the finest minds in computer science at Bell Labs in 1970. In their task, they set themselves simple objectives. First, they wanted an OS that could smoothly run on whatever hardware they could find since, ironically, they had a hard time finding any computers to work with at Bell. They also wanted their OS to allow multiple users to log in and run programs concurrently without bumping into each other. Finally, they wanted the OS to be simple to administer and intuitively organized. After acquiring devices from the neighboring department, which had a surplus, the team eventually created Unix.
Unix was adopted initially, and vigorously so, by university computer science departments for research purposes. The University of Illinois at Urbana-Champaign and the University of California, Berkeley led the charge, with the latter going so far as to develop its own brand of Unix called the Berkeley Software Distribution, or BSD.
Eventually, AT&T, Bell’s successor, lost interest in Unix and jettisoned it in the early 90s. Shortly following this, BSD grew in popularity, and AT&T realized what a grave mistake it had made. After what is probably still the most protracted and aggressive tech industry legal battle of all time, the BSD developers won sole custody of the de facto main line of Unix. BSD has been Unix’s elder statesman ever since, and guards one of the purest living, widely available iterations of Unix.
My conception of Unix and its accompanying overall approach to computing is what I call the “Unix Way.” It is the intersection of Unix structure and Unix philosophy.
To begin with the structural side of the equation, let’s consider the filesystem. The design is a tree, with every file starting at the root and branching from there. It’s just that the “tree” is inverted, with the root at the top. Every file has its proper relation to “/” (the forward slash notation called “root”). The whole of the system is contained in the directories found here. Within each directory, you can have a practically unlimited number of files or other directories, each of which can have an unlimited number of files and directories of its own, and so on.
More importantly, every directory under root has a specific purpose. I covered this a while back in a piece on the Filesystem Hierarchy Standard, so I won’t rehash it all here. But to give a few illustrative examples, the /boot directory stores everything your system needs to boot up. The /bin, /sbin, and /usr directories retain all your system binaries (the things that run programs). Configuration files that can alter how system-owned programs work live in /etc. All your personal files such as documents and media go in /home (to be more accurate, in your user account’s directory in /home). The kind of data that changes all the time, namely logs, gets filed under /var.
In this way, Unix really lives by the old adage “a place for everything, and everything in its place.” This is exactly why it’s very easy to find whatever you’re looking for. Most of the time, you can follow the tree one directory at a time to get to exactly what you need, simply by picking the directory whose name seems like the most appropriate place for your file to be. If that doesn’t work, you can run commands like ‘find’ to dig up exactly what you’re looking for. This organizational scheme also keeps clutter to a minimum. Things that are out-of-place stand out, at which point they can be moved or deleted.
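As a minimal sketch of that navigation, here is a throwaway tree built under /tmp (the directory names are invented for illustration; pointing ‘find’ at “/” would walk the real system tree the same way):

```shell
# Build a tiny tree: directories inside directories, each with a clear purpose
mkdir -p /tmp/unixway/docs /tmp/unixway/media

# "A place for everything": the file's path says what it is
echo "meeting notes" > /tmp/unixway/docs/notes.txt

# When following the tree by hand gets old, 'find' walks it for you
find /tmp/unixway -name '*.txt'   # prints /tmp/unixway/docs/notes.txt

# Tidy up, keeping clutter to a minimum
rm -rf /tmp/unixway
```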
Another convention which lends utility through elegance is the fact that everything in Unix is a file. Instead of creating another distinct digital structure for things like hardware and processes, Unix thinks of all of these as files. They may not all be files as we commonly understand them, but they are files in the computer science sense of being groups of bits.
This uniformity means that you are free to use a variety of tools for dealing with anything on your system that needs it. Documents and media files are files. Obvious as that sounds, it means they are treated like individual objects that can be referred to by other programs, whether according to their content format, metadata, or raw bit makeup.
Devices are files in Unix, too. No matter what hardware you connect to your system, it gets classified as a block device or a stream device. Users almost never mess with these devices in their file form, but the computer needs a way of classifying these devices so it knows how to interact with them. In most cases, the system invokes some program for converting the device “file” into an immediately usable form.
Block devices represent blocks of data. While block devices aren’t treated like “files” in their entirety, the system can read segments of the block device by requesting a block number. Stream devices, on the other hand, are “files” that present streams of information, meaning bits that are being created or sent constantly by some process. A good example is a keyboard: it sends a stream of data as keys are pressed.
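You can see the distinction for yourself with ‘ls -l’, whose first output character encodes a file’s type (this assumes a conventional /dev layout, as on Linux or BSD):

```shell
# First character of the listing: 'c' = character/stream device,
# 'b' = block device, 'd' = directory, '-' = ordinary file
ls -l /dev/null   # starts with 'c': a stream that discards whatever it's fed
ls -ld /tmp       # starts with 'd': a directory
```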
Even processes are files. Every program that you run spawns one or more processes that persist as long as the program does. Processes regularly start other processes, but can all be tracked by their unique process ID (PID) and grouped by the user that owns them. By classifying processes as files, locating and manipulating them is straightforward. This is what makes reprioritizing selfish processes or killing unruly ones possible.
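A quick sketch of that PID-based handling, using only standard tools (the 60-second sleep stands in for any long-running program):

```shell
# Start a long-running process in the background; the shell reports its PID in $!
sleep 60 &
pid=$!

# Inspect it by PID, much as you would stat a file by name
ps -o pid,comm -p "$pid"

# Lower its priority so it yields to other work...
renice -n 10 -p "$pid"

# ...or remove it outright when it misbehaves
kill "$pid"
```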
To stray a bit into the weeds, you can witness the power of construing everything as a file by running the ‘lsof’ command. Short for “list open files,” ‘lsof’ enumerates all files currently in use which fit certain criteria. Example criteria include whether or not the files use system network connections, or which process owns them.
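A small taste of ‘lsof’ in practice (it ships with most systems, though minimal installs may need to add it):

```shell
# Every file held open by your current shell ($$ expands to its PID):
# its working directory, the shell binary, shared libraries, and so on
lsof -p $$

# Only the "files" that are network connections, restricted to TCP
lsof -i tcp
```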
The last element I want to point out (though certainly not the last that wins my admiration) is Unix’s open computing standard. Most, if not all, of the leading Unix projects are open source, which means they are accessible. This has several key implications.
First, anyone can learn from it. In fact, Linux was born out of a desire to learn and experiment with Unix. Linus Torvalds wanted to study and modify Minix, but its license at the time restricted how it could be changed and redistributed. In response, Torvalds simply made his own Unix-like kernel, Linux. He later published the kernel on the Internet for anyone else who also wanted to play with Unix. Suffice it to say that there was some degree of interest in his work.
Second, Unix’s openness means anyone can deploy it. If you have a project that requires a computer, Unix can power it; and because its architecture makes it highly adaptable, it is great for practically any application, from tinkering to running a global business.
Third, anyone can extend it. Again, due to its open-source model, anyone can take a Unix OS and run with it. Users are free to fork their own versions, as happens routinely with Linux distributions. More commonly, users can easily build their own software that runs on any type of Unix system.
This portability is all the more valuable by virtue of Unix and its derivatives running on more hardware than any other OS type. Linux alone can run on essentially all desktop or laptop devices, essentially all embedded devices including mobile devices, all server devices, and even supercomputers.
So, I wouldn’t say there’s nothing Unix can’t do, but you’d be hard-pressed to find it.
Considering the formidable undertaking that is writing an OS, most OS developers focus their work by defining a philosophy to underpin it. None has become so iconic and influential as the Unix philosophy. Its impact has reached beyond Unix to inspire generations of computer scientists and programmers.
There are multiple formulations of the Unix philosophy, so I will outline what I take as its core tenets.
In Unix, every tool should do one thing, but do that thing well. That sounds intuitive enough, but plenty of programs weren’t (and still aren’t) designed that way. What this precept means in practice is that each tool should be built to address only one narrow slice of computing tasks, but that it should also do so in a way that is simple to use and configurable enough to adapt to user preferences regarding that computing slice.
Once a few tools are built along these philosophical lines, users should be able to use them in combination to accomplish a lot (more on that in a sec). The “classic” Unix commands can do practically everything a fundamentally useful computer should be able to do.
With only a few dozen tools, users can:
Another central teaching of Unix philosophy is that tools should not assume or impose expectations for how users will use their outputs or outcomes. This concept seems abstract, but is intended to achieve the very pragmatic benefit of ensuring that tools can be chained together. This only amplifies what the potent basic Unix toolset is capable of.
In actual practice, this allows the output of one command to be the input of another. Remember that I said that everything is a file? Program outputs are no exception. So, any command that would normally require a file can alternatively take the “file” that is the previous command’s output.
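A minimal sketch of that chaining: each stage is one small tool, and the pipe (|) turns the previous stage’s output into the next stage’s input, with no temporary files in between:

```shell
# Count the distinct words in a sentence
echo "to be or not to be" |
  tr ' ' '\n' |   # split: one word per line
  sort |          # group duplicate words together
  uniq |          # collapse each group to a single line
  wc -l           # count what remains → 4 (be, not, or, to)
```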
Lastly, to highlight a lesser-known aspect of Unix, it privileges text handling and manipulation. The reason for this is simple enough: text is what humans understand. It is therefore what we want computational results delivered in.
Fundamentally, all computers truly do is transform some text into different text (by way of binary so that it can make sense of the text). Unix tools, then, should let users edit, substitute, format, and reorient text with no fuss whatsoever. At the same time, Unix text tools should never deny the user granular control.
In observing the foregoing dogmas, text manipulation is divided into separate tools. These include the likes of ‘awk’, ‘sed’, ‘grep’, ‘sort’, ‘tr’, ‘uniq’, and a host of others. Here, too, each is formidable on its own, but immensely powerful in concert.
Regardless of how fascinating you may find them, it is understandable if these architectural and ideological distinctions seem abstruse. But whether or not you use your computer in a way that is congruent with these ideals, the people who designed your computer’s OS and applications definitely did. These developers, and the pioneers before them, used the mighty tools of Unix to craft the computing experience you enjoy every day.
Nor are these implements relegated to some digital workbench in Silicon Valley. All of them are there — sitting on your system anytime you want to access them — and you may have more occasion to use them than you think. The majority of problems you could want your computer to solve aren’t new, so there are usually old tools that already solve them. If you find yourself performing a repetitive task on a computer, there is probably a tool that accomplishes this for you, and it probably owes its existence to Unix.
In my time writing about technology, I have covered some of these tools, and I will likely cover yet more in time. Until then, if you have found the “Unix Way” as compelling as I have, I encourage you to seek out knowledge of it for yourself. The Internet has no shortage of this, I assure you. That’s where I got it.
More people than ever now work from home as a result of the pandemic, and this shift has necessitated a rethinking of how home offices are designed and furnished. The E-Commerce Times checked in with providers of workspace furniture and accessories for insights on how the adaptation to WFH has affected home office setups.
“It seems that the new focus on home office is here to stay,” Dave Adams, vice president of marketing for home furnishings provider BDI, told the E-Commerce Times. “Even post-pandemic, the way that people work will be shifted. Employers are finding that their employees can be productive from home. Large companies have invested heavily in remote work tools, and there’s no sign that they’ll be returning to the old way of doing business anytime soon.”
Contemporary home “offices,” in fact, might take the form of everything from the corner of a bedroom to the dining room table, and people are coming up with a variety of solutions to fit their own particular needs and spaces.
“In recent months, it seems like the key to working from home is creativity,” explained Adams. “People are creating workspace where there is no space, using anything from the coffee table to the dining table to an ironing board. We have seen some very creative solutions.
“A recent report stated that 46 percent of those who had never worked from home before now plan on working from home more often in the future. As people adapt to this new normal, they are searching for more permanent workspace solutions and we have seen a tremendous spike in interest for home office furniture.”
One thing people working at home are finding is that doing more with less can be a helpful strategy.
“Not everyone has the room for a dedicated home office,” said Adams. “The new corner office may be the corner of your bedroom. Smaller-scale desks and writing tables are not only a space-saving solution; they are also an economical way to make the most use of available space.
“Even though you are working at a smaller scale, however, you should not have to sacrifice features. There are smaller-scale standing and seated desks that still include important features, such as wire management, keyboard drawers and additional storage options.”
Small pieces of furniture with multiple uses are becoming central to this new conception of the home office.
“With Mom, Dad and even the kids working from home, it’s understandable that everyone may not get their own dedicated workspace,” Adams added. “We have seen a surge in our small and mobile laptop tables that allow people to work productively while at the sofa, yet move out of the way when they aren’t needed.”
Laptop tables and desks can be particularly useful, since they can convert even bed or sofa space into a kind of office.
“Most of our customers are using it for their work,” Ashley Janssen, head of the communications department at WoodenLapDesk, told the E-Commerce Times, referring to the lap desk her company sells. “They are using it in their backyard or in their bedroom, or even as a semi-standing desk. Some of our customers plant the lap desk on their current desk and create a standing desk.”
People are finding lap desks, in fact, can be used for non-work-related pursuits, as well.
“Other purposes of the lap desk are for personal use, like drawing, sketching, watching Netflix series in your bedroom, or simply as a breakfast tray in bed,” said Janssen.
Contemporary home office furniture and design must, ultimately, take into account changes in the way people work, communicate, and structure their lives.
“The demands of the home office have changed over time,” explained Adams. “People are working with a lot less paper, meaning the need for file storage has lessened for many. Computers have gotten smaller, and it’s almost unique to see a full desktop and CPU tower. With the advent of Bluetooth, there are fewer wires to manage and connections to juggle.”
Storage is also something that must be carefully considered when setting up a home office.
“In order to create an efficient and organized home office, you have to have more than just a desktop and four legs,” said Adams. “Consumers need to have integrated storage for supplies and accessories. Wiring needs to be properly routed and easily accessible. Keyboard drawers are a feature everyone seems to gravitate towards so that they can — even just symbolically — close up shop at the end of the day. Solutions that are expandable, with accessory pieces that can be added to as their needs change, are very popular.”
Though the pandemic was what initially moved many people’s work into their homes, this new way of working may well become, to some degree, permanent. As people realize the magnitude of this shift, they’re rethinking their initial strategies and plans for home offices.
“When COVID lockdowns initially began in March, we saw an uptick in single item orders — specifically for office chairs — essentially simple items to supplement a home office set up and to help individuals get through the lockdown period until they could get back into the office,” Verity Sylvester, co-founder of Branch, told the E-Commerce Times.
“Over the last few months, we have noticed a shift in mentality to where customers are now purchasing complete office setups — chairs, desks, standing desks, filing cabinets, power accessories, etc. — as they realize that working from home is not just a short stint. It will be a longer and more permanent part of the way individuals work,” she observed.
The shift to working at home will likely even change how and where people decide to live in the first place.
“We believe the demand for home offices, and for more complete home offices, is only going to grow,” said Sylvester. “Individuals are at the point where they are now realizing that working from home is more permanent, and they are adapting how they live to accommodate how they want to work, moving into larger apartments and/or homes that can accommodate complete home offices.”
Home offices, as they become the norm, could mean a cultural shift in how houses are designed and what features buyers seek when they’re looking for a home.
“New home buyers are specifically shopping for dedicated rooms to create home offices in so they can have some separation between work and real life,” said BDI’s Adams. “The blurring of work and home lives has proved to be troublesome for many. As these dedicated offices become established, the need for larger-scale home systems will flourish.”
Throughout the early fall, analysts have been treated to a continuous stream of announcements from the CRM community, especially Salesforce and Oracle. New product availability belies the facts on the ground of an economy hobbled by a pandemic. It’s a gusher of good technology intended for a market eager to snap it up. That said, I am not sure about the market and we might be looking at a technology glut.
Glut is a strange word in software since there’s no inventory to back up as you’d see in a more conventional glut. Over the summer, gas prices were very low because producers kept pumping in the face of declining demand, which created a glut. Software is different, especially in the cloud era. When demand happens it’s a simple transaction and users can begin almost immediately to use the technology. But that doesn’t mean you can’t have a glut.
My research this year establishes a disconnect between the wonderful features and functions of the new technologies and the realities of how companies use them — or not. In surveys of over one thousand end users from as many companies of all sizes, including multi-billion-dollar ones, the evidence shows that this technology is not reaching users.
They complain of stand-alone systems that aren’t integrated and of running as many as eight apps at once trying to do their jobs. People understandably run out of time in a day too. They work long hours, do business from their phones at the gym, their children’s school activities, and even in the bathroom.
The old hypothesis explaining this lack of adoption was that software was too costly and took too much time to install and maintain. Back in the day that was true, but no more. Cloud computing is cheap, easy to deploy, and easy to use, and vendors refresh their apps with new features and functions multiple times per year. That’s a far cry from the annual upgrade season we saw twenty years ago.
You might be tempted to blame economic conditions for slack demand if it exists, but that’s not it. My surveys straddle COVID’s before and after. In truth, for the two plus decades that I’ve been following CRM, there’s always been adoption reluctance emanating from the front office. Rather than accepting CRM’s different approach to doing business, many people still resist it, preferring to go with their gut or stick with manual systems, because updating CRM is tedious.
Many businesses have bought into CRM, of course. It isn’t an 80-billion-dollar industry for nothing. But for many years we’ve seen that only about a quarter of organizations use CRM appropriately. Others use it for simple record keeping, or barely use it at all. A great tell for this is that in one of my studies, CRM ranked only fourth in importance as a tool people use daily. Email ranked higher. Email.
This is important to everyone because the first decades of CRM were largely about gathering and consolidating data that people could use in their customer-facing jobs. But today, a lot of those jobs have been automated away. When was the last time you went to a website or made a call for support and interacted with a person? You could, but first you needed to get through a very good self-service system that likely solved your problem sans human.
We’ve turned a corner in CRM. Two decades ago, you could put off adoption by saying the stuff doesn’t work or it doesn’t fit my oh-so-unique business. Not so much anymore.
This fall, companies like Salesforce are introducing advanced industry-specific vertical solutions, while Oracle continues to refine its platform, analytics, and infrastructure to do much the same.
All of this is done in an effort to boost the prospects of what’s been called the digital disruption, but is increasingly becoming just business as usual, which brings us to the glut. Currently available CRM systems might now outstrip the abilities of customers to adopt them.
It’s tempting to say that customers just need to get down to work, but that ignores a pressing reality. If my data is right, the people who need CRM are already under water. They’re working too hard and spending more than the eight hours a day that they’re paid for. So there’s scarcely any time to take on something new. That’s partly why my studies indicate relative complacency with the cobbled together systems they’re using — at least they can do their jobs, make their quotas and go home.
Learning something new at work, no matter how good it is, can be a daunting task, and one that needs to be supported by organizations that have used CRM to reduce their overhead. Supporting it might mean bringing on a few more people to spread the work around and make learning something new feasible.
This should surprise exactly no one, though business has often not been good at this kind of transition. In the 1970s and ’80s, car makers went through a similar turning point. It was a time when they were converting from mostly rear-wheel-drive cars to front-wheel drive. At the same time, they were obsessed with quality control, reimagining the manufacturing process, and introducing robots. They were also fighting intensified foreign competition for the first time.
No wonder there’s reluctance to adopt new technology: during that era, U.S. auto makers lost about half of their market share.
That’s about where we are in CRM today. Some companies are adopting digital disruption strategies early, while others haven’t kept up. The big difference today however is that the big vendors are taking steps to protect their customers from the downside risks. They’re spinning up online training functions and bringing in expert partners to help.
Ultimately, it’ll take additional resources from both vendors and customers to bring us into the digital CRM era. So, as I look at all of the new technology available this fall, I smile at vendor creativity and I hope that ingenuity extends to new approaches to implementation, training, and even financing.
Google on Tuesday announced its rebranding of G Suite as Google Workspace, along with its newly integrated desktop environment that is designed for enhanced collaboration and communications, marking its latest bid to challenge Microsoft 365.
Google Cloud officials said the Workspace environment will create a new user experience that integrates meetings, docs, messaging and tasks, in a bid to help remote workers and students operate in a more productive, secure and collaborative environment.
“This is the end of the ‘office’ as we know it,” Javier Soltero, vice president and general manager of Google Workspace, said in a statement. “From here on out, teams need to thrive without meeting in person, protect their time to focus on the most impactful work, and build human connection in new ways.”
Soltero, a former Microsoft executive in charge of Cortana and Outlook, joined Google in late 2019 to head up G Suite. He was already making plans to better position the productivity bundle to compete against Microsoft 365, but the COVID-19 pandemic and subsequent work-from-home requirements at many international companies created even greater demand for collaborative tools that could help workers operate on projects across various time zones.
Among new features in Google Workspace, users will gain access to the following functions:
Docs, Sheets and Slides now contain “linked previews” that allow users to preview the content of links without leaving the original document. Google says this feature will save time switching between apps and tabs and allow more time to get work done.
“Smart chips” will be embedded in Docs, Sheets and Slides: when a user @mentions a person inside a document, a pop-up appears that adds context and suggests actions, such as sharing the document.
In the coming weeks, Google Workspace will allow users to dynamically create and collaborate on documents (in Docs, Sheets and Slides) within a room in Chat, without switching tabs or tools.
Within the next few months, Google will roll out Meet picture-in-picture to Docs, Sheets and Slides, following the previously announced introduction of the functionality to Chat and Gmail.
Google Workspace plans start at US$6 per person, per month, and go up from there. Higher-priced tiers offer more features, including larger video conferencing capacity and more storage. Google Workspace is available as a free 14-day trial, with a monthly charge thereafter.
In July Google announced a series of changes to G Suite that were designed to enhance collaboration by better integrating its various communication tools.
For example, users gained the ability to access Google Meet directly from the Gmail inbox on the web. That announcement was preceded in June by a move to integrate Chat into Gmail on the web. The Meet and Chat integrations on the web were later expanded to the iOS and Android platforms as well. Collaboration features in Chat rooms were also enhanced in the June integration with the addition of shared files and tasks.
According to Google, currently more than 2.6 billion users across consumer, enterprise and educational settings use the company’s productivity apps every month. As of March, more than six million corporate customers paid to use G Suite; and as of June, 140 million students and teachers used G Suite for Education for a range of activities, including creation, collaboration and communication.
Google said it plans to bring Google Workspace to its nonprofit and educational customers in the coming months. In the meantime, customers can continue to access tools through G Suite for Education or G Suite for Nonprofits.
At least one industry analyst sees the goal of Google Workspace as making the experience of the various G Suite apps more seamless for users and making navigation between the various tools more intuitive.
“I think it’s simplification,” Craig Roth, research vice president, technology and service providers at Gartner told the E-Commerce Times. “There are a lot of components in the suite or any office suite. Having some sort of hub that connects it all is important for productivity.”
Microsoft declined to comment on the Google Workspace announcement, but the company has made a number of additions to its productivity suite since the beginning of the pandemic. In July it introduced a series of enhancements to its Teams collaboration software, including Together mode, for virtual meetings — and just last month Microsoft launched “virtual commute,” to help remote workers bridge the gap between the end of their work day and the return to family tasks.
As Andy Williams famously sang in his seasonal classic of the same name, the year-end holiday season is considered the “most wonderful time of the year.” However, the year 2020 has sung a different tune thus far from every other year — and as many businesses that closed during the first wave of the pandemic still struggle to recover, it’s yet to be seen if this year-end holiday season lives up to Williams’ iconic lyrics.
The restrictions forcing people to work and shop from the confines of their homes accelerated an already advancing e-commerce market that was years in development. As a result, the pandemic has created a case study for e-commerce and retail of the 21st century. Moving forward, this holiday shopping season will likely witness one of the next major shifts toward a different future in retail.
It’s been well documented in the retail industry that during the last decade, e-commerce has encroached on brick-and-mortar sales.
In 2019, for example, consumers spent US$601.75 billion online, compared with $136.4 billion in 2007, according to a Digital Commerce 360 report, a more than fourfold increase.
Not only have total sales skyrocketed, but so has the e-commerce share of total retail sales. Brick-and-mortar shops might not be out of the game quite yet, but the evident growing demand for the variety of items and convenience of buying online certainly looms large.
Fast forward to Q1 of 2020. The pandemic created a unique scenario whereby e-commerce sellers sold in record numbers. With much of the population relegated to staying at home during the peak of the first wave, consumers went on their devices at home to shop. During Q2 of 2020, consumers dished out $200.72 billion for online purchases with U.S. retailers, an increase of 44.4 percent from the $138.96 billion spent in Q2 of 2019.
This year, retail e-commerce sales totals will depend largely on the eagerness of shoppers to spend large amounts on gifts if they’re facing reduced disposable income — not to mention fears about being in crowded spaces.
Despite these concerns, Salesforce predicts that up to 30 percent of global retail sales will be made through digital channels this upcoming holiday season. Meanwhile, Deloitte estimates that holiday e-commerce sales will reach between $182 billion and $196 billion, which is a 25 to 35 percent increase from 2019’s $145 billion.
As COVID-19 drives more e-commerce sales, retailers must make new considerations to prepare accordingly, which translates to new logistical considerations, ranging from site infrastructure to last-mile delivery.
For retailers looking to cash in during the opportune holiday shopping season, it’s imperative first to review the existing retail infrastructure and assess its preparedness for a larger volume of consumers.
During the last few years, retailers have invested significantly to improve the customer experience and cater to the growing e-commerce demand. Walmart is one example of a big-box retailer that has successfully expanded an infrastructure now capable of offering quick delivery and a wide variety of products on a platform vying with Amazon.
Although most retailers don’t have the financial capacity to upgrade their e-commerce platforms to the extent that these giants are able to, finding solutions for sustaining uptime, ensuring fast page load times, and protecting consumers will all be especially critical for the anticipated increase of the 2020 holiday shopping season. Shopify, which saw enormous revenues in Q2 of 2020, remains the e-commerce platform white label leader and will likely be the crutch on which many proprietors depend in the coming weeks.
Some e-commerce businesses may also consider technologies like virtual reality and process automation to enhance the online shopping experience, or promote their products through social commerce, a growing e-commerce channel.
On the delivery side of logistics, e-commerce sellers have some considerations to mull over regarding the cost of shipping, which has risen significantly for air and last-mile delivery. While many sellers spent time and money upgrading their last-mile delivery during the last two years, the pandemic has posed some new supply chain challenges.
Last-mile delivery, already considered the most inefficient part of the fulfillment process, will see much more demand from consumers this season, and questions remain as to whether the disrupted supply chains will be able to keep up.
The cost of shipping by air and sea rose by several percentage points during the summer. These increased costs could trickle down to the consumer in some cases but could also end up as costs that sellers will need to absorb in order to appease shoppers.
How e-commerce sellers respond to the COVID-19 challenges will dictate what’s to come. Because the pandemic could last well into — and perhaps beyond — 2021, the 2020 holiday shopping season will likely set important precedents both in the market and logistically.
Brick-and-mortar stores may not disappear from our lives entirely, but their slow demise could be hastened by the prolonged imposition of social restrictions brought on as a result of the pandemic.
Before COVID-19, brick-and-mortar stores experienced a gradual decline that matched the increase in e-commerce sales. So with a sustained pandemic, we will likely continue to see the sharp acceleration of e-commerce, where consumers of all generations grow more accustomed to online shopping.
Even recurrent shopping for groceries in the U.S. has now become a booming segment in e-commerce. As this trend continues, logistics will have to adjust, which means predominantly brick-and-mortar retailers will shift their real estate holdings from physical shops to distribution centers for their e-commerce infrastructure.
Only time will tell what happens with the 2020 holiday shopping season, but the developments in e-commerce during the last few years are now blossoming to a greater degree because of the conditions forced by the pandemic.
“Survival of the fittest,” the phrase popularly associated with Charles Darwin, says it best. As the limitations we face pave the way for the dominance of e-commerce, it’s the retailers who can best adapt to the changing logistical and consumer demands that will lead the industry this holiday season and for years to come.
Marketers estimate that 61 percent of students started the new school year where they ended last school year — remotely. So why are retailers still featuring traditional school supplies online, offering discounts and promotions retroactively, rather than using real-time data to inform decisions?
That school advertising example seems out of joint in light of the rebounding U.S. economy. It also suggests that advertisers are missing the mark in targeting consumers more effectively as the shift in both work from home and learn at home paradigms become the new normal.
Marketers have to adapt to changing circumstances, and they have to play better hunches when handling supply and demand issues. In a marketplace fueled by the pandemic, supply and demand are shifting constantly.
The approach retailers take when it comes to back-to-school shopping is a strong example of how it should not be done, according to Chris Dessi, vice president of Americas at Productsup.
“Just because businesses are geared up to sell online, does not mean they are doing it in the most effective way,” he told the E-Commerce Times.
Instead, advertisers should target buyers of traditional classroom shopping as two distinct groups. One group is for consumers buying supplies for those physically starting the year in schools. The other group is for those consumers who need supplies to enter a remote learning environment, he suggested.
In some instances, hybrid models have been enacted or start dates are delayed. Because of these ever-changing factors, retailer strategies just are not up to par when it comes to catering to an individual’s needs, he noted.
“If you go to any major retailer’s site, you will see a colorful scheme of backpacks, shoes, lunch boxes — items that paint this portrait that life is normal, when the reality is, it is not,” said Dessi.
U.S. e-commerce is experiencing rapid growth with no sign of slowing down. Retailers and brands need to adjust their digital infrastructure and tactics as they prepare for their holiday marketing and ad campaigns, according to Melissa Sargeant, CMO of Litmus.
“Brands that create authentic, personal customer experiences will emerge from the economic crisis faster and better positioned than before,” she told the E-Commerce Times.
This is especially important as analysts project a roughly 25 percent drop in U.S. marketing spend. That shows a need to emphasize the value of authentic brand experiences while also maximizing budget and efficiency, she added.
So why are advertisers not fully capitalizing on the new situation? Many marketers have not yet figured out how.
“In the past, it was pretty simple to predict what supplies students need for the school year, allowing for blanket promotions and a flurry of deals. Now, it is a much more difficult situation that advertisers have never dealt with before,” Productsup’s Dessi observed.
Customer experience can make or break how a user views a brand. Offering irrelevant deals can be a huge turn off and encourage the shopper to look elsewhere. For instance, a parent trying to prep for three of their kids’ virtual schooling wants to see deals on technology and desks, not discounts on first-day outfits, he explained.
On the flip side, some kids were excited to sport a new look the first day of in-person school. So their parents were going to buy new clothes with or without a deal. This tricky situation exists where retailers are not showing shoppers relevant products, while also leaving money on the table by offering needless discounts.
“We need to also consider the power of local advertising. Savvy digital marketers can market to people based on store locations, leveraging inventory details and powerful calls to action,” Dessi said.
Each school district has different and ever-changing policies. So marketers cannot determine where their buyers are. Some school policies are changing daily, even hourly. There are too many moving parts to get out ahead of it, he suggested.
A lot of retailers’ backend systems and processes for choosing products and sales are still fairly manual, according to Mark William Lewis, founder/CTO of Netalico Commerce. It is based on individuals’ decisions. Plus, the supply and demand this year has been extremely unpredictable due to the unprecedented effects of the global pandemic.
“Retailers have to make predictions months in advance about what the demand will be to stock the right products due to the supply chain constraints. That is difficult to do when state and local governments are making decisions about closures daily,” he told the E-Commerce Times.
Take, for example, the ongoing situation with laptops, fueled by the two parallel supply and demand streams of work from home and learn at home.
This all comes down to poor feed management. Not enough information flows between the supplier, retailer, and customer. With laptop shortages rampant across the country, it is important for retailers to communicate inventory to their customers or else face lost sales, noted Productsup’s Dessi.
“In the same vein, once retailers realized that laptops were a hot-ticket item, it would be in their best interest to ensure there are not discounts on these items. Few advertisements offered discounts and promotions retroactively, showcasing an inability to adjust based on real-time events,” he explained.
The opportunity here is based on agility, he continued. The most agile marketers always win. Proper tech, supporting proper marketing is the game.
“Those who do not have proper tech implemented will dry up faster than pre-COVID-19,” said Dessi.
Global attrition of marketers who are not tech-savvy is becoming apparent. It is an expedited Darwinism that will damage previously insulated brands and retailers, he noted.
“Whereas those who were already playing in the feed management and optimization space are capitalizing on the chaos, and not just beating the competition. They are eviscerating them,” he quipped.
U.S. online sales are up nearly 80 percent for June year-over-year and, going into the holiday season, this trend is expected to continue. This stresses the importance of brands gathering as many insights about their target audience through email to then use to inform other channels, according to Litmus’ Sargeant.
If marketers neglect this part of the process in their holiday marketing efforts, their multichannel marketing plan will miss the mark. Instead, they need to invest in the audience insights garnered via email to inform the rest of the brand’s marketing strategy, she advised.
“Then, they will know how to maximize their efficiency and budget,” she said. “Email needs to be front and center within every retail and e-commerce marketing mix.”
Marketers have to create more authentic human-to-human connections and experiences with customers. Consumers expect brands to know what they want, she added.
Adapting requires essential components for marketers, cautioned Dessi. They need real-time consumer data in order to ensure pricing models are accurate. This makes it possible for consumers to actually see the products they need.
“This is how retailers can stand out against competitors,” he said. “When you are relying on online channels to make sales, you simply have to have a superb experience and exceed customer expectations.”
In today’s world, marketers in particular and merchants, in general, can expect more instances of unexpected, unsteady, and unprecedented changes in supply and demand. Retailers need to get ahead of that now, ensuring their feed management and real-time inventory processes are up to date, Dessi recommended.
One way to do this is to partner with the right e-commerce solution providers. Part of that includes using real-time data, which is the only way to inform backend decisions while ultimately satisfying the customer, he said.
“As more brands rightfully pivot to e-commerce, standing out against competitors is going to come down to relevance, showing customers products that are applicable to their day-to-day life, while ensuring product information is accurate and updated on an on-going basis,” Dessi added.
The aggressive pendulum swings of inventory, price changes, and product availability in the online world are now mimicked in the real world. They are one, he noted.
Over the past few years, the concept of “zero trust” architecture has gone through a number of evolutionary phases. It’s gone from being the hot new fad, to being trite (in large part due to a deluge of marketing from those looking to cash in on the trend), to passé, and now has ultimately settled into what it probably should have been all along: a solid, workmanlike security option with discrete, observable advantages and disadvantages that can be folded into an organization’s security approach.
Zero trust, as the name implies, is a security model in which all assets (even managed endpoints that you provision and on-premises networks that you configure) are considered hostile, untrustworthy, and potentially already compromised by attackers. Where legacy security models differentiate a “trusted” interior from an untrusted exterior, zero trust assumes that all networks and hosts are equally untrustworthy.
Once you make this fundamental shift in assumptions, you start to make different decisions about what, who, and when to trust, and acceptable validation methods to confirm a request or transaction is allowed.
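To make that shift concrete, here is a minimal sketch, assuming a hypothetical shared-secret signing scheme rather than any particular vendor’s product, of how a zero-trust service might validate every request on its own merits, with the source network conferring no trust at all:

```python
# Hypothetical illustration of zero-trust request validation.
# The key and scheme are assumptions for this sketch; a real deployment
# would use proper key management, mutual TLS, or signed identity tokens.
import hashlib
import hmac

SIGNING_KEY = b"example-shared-secret"  # placeholder, not a real secret


def verify_request(user_id: str, payload: bytes, signature: str) -> bool:
    """Allow a request only if its credential checks out, whether it
    arrived from the office LAN or the public internet."""
    expected = hmac.new(SIGNING_KEY, user_id.encode() + payload,
                        hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature)
```

Note what is absent: there is no check of the source IP range and no “trusted subnet” allowlist, because under zero trust those signals carry no weight.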
As a security mindset, this has advantages and disadvantages.
One advantage is that it lets you strategically apply security resources where you need them most; and it increases resistance to attacker lateral movement (since each resource needs to be broken anew should they establish a beachhead).
There are disadvantages too. For example, policy enforcement is required on every system and application, and older legacy components built with different security assumptions (e.g., that the internal network is trustworthy) may not fit in well.
One of the most potentially problematic downsides has to do with validation of the security posture, i.e., situations where the security model requires review by older, more legacy-focused organizations. The dynamic is unfortunate: the organizations likely to find the model most compelling are the very ones that, in adopting it, are likely to set themselves up for vetting challenges.
To understand the dynamic we mean here, it’s useful to consider what the next logical step is once zero trust has been embraced. Specifically, if you assume that all endpoints are potentially compromised and all networks are likely hostile, a natural and logical consequence of that assumption is to minimize where sensitive data can go.
You might, for example, decide that certain environments aren’t sufficiently protected to store, process, or transmit sensitive data other than through very narrowly defined channels, such as authenticated HTTPS access to a web application.
Where heavy use is made of cloud services, it is quite logical to decide that sensitive data may be stored only in the cloud, subject of course to access-control mechanisms built explicitly for that purpose, and backed by security measures and operational staff that you couldn’t afford to deploy or maintain just for your own use.
As an example, say that you have a hypothetical younger organization in the mid-market. By “younger,” we mean that maybe only a few years have passed since the organization was established. Say this organization is “cloud native,” that is, 100% externalized for all business applications and architected entirely around the use of cloud.
For an organization like this, zero trust is compelling. Since it is 100% externalized, it has no datacenters or internal servers, and maintains only the most minimal on-premise technology footprint. This organization might explicitly require that no sensitive data can “live” on endpoints or inside their office network. Instead, all such data should reside in the subset of known, defined cloud services that are explicitly approved for that purpose.
Doing this means the entity can focus all of its resources on hardening cloud infrastructure, gate services so that all access (regardless of source) is protected in a robust way, and deprioritize things like physical security, hardening the internal network (assuming there even is one), and deploying internal monitoring controls. Assuming a diligent, workmanlike process is followed to secure the use of the cloud components, such an approach can help focus limited resources.
However, the above example organization doesn’t operate in a vacuum — no organization does. It works with customers, leads in the sales process, business partners, and numerous others. Since the organization is a smaller one, many of its customers might be larger organizations — potentially customers with stringent requirements about securing external service providers and validating their security. Perhaps it has a regulatory obligation to do so depending on what industry it’s in. Now some of these customers might be fully externalized but the majority won’t be — they’ll have legacy applications, unique constraints, specialized requirements, and other business reasons why they can’t support a fully external model.
What results is often a perfectly understandable, but nevertheless counterproductive, discussion at cross purposes between the organization doing the assessment (the potential customer) and the one being assessed (the service provider). A service provider, for example, might very reasonably argue that physical security controls (to pick just one example) are out of scope for the purposes of the assessment. They might argue this on the basis that the only physical security controls that matter are the ones at the cloud providers they employ since, after all, this is the only place where data is allowed to reside.
The customer on the other hand might, also reasonably, worry about aspects of physical security that do relate to the service provider’s environment. For example, visitor access to facilities where customer data might be viewed on screen, even if the data isn’t stored there. They might envision a scenario, for example, where an unauthorized visitor to the office might “shoulder surf” data as it’s being entered on screen by a legitimate user.
A conversation like the one above, even when it doesn’t become contentious, is still suboptimal for both parties involved. From the point of view of the service provider, it slows down the sales process and saps time away from engineers who would otherwise be focused on product development. From the point of view of the potential customer though, it makes them nervous about potential sources of unaccounted for risk — while simultaneously generating ill-feeling with internal business partners anxious to onboard the service and who would like to see vetting happen quickly.
So, the question then becomes: How do we effectively communicate a zero-trust model if we wish to employ one in this way? If we’re validating such an approach, how do we answer the right questions so that we can reach a determination quickly and (ideally) enable business use of the service? It turns out there are a few approaches we can leverage. None of them are rocket science, but they do require having empathy — and doing some legwork — to support.
From a service provider point of view, there are three useful principles to keep in mind: 1) be forthcoming, 2) demonstrate validation of your assumptions, and 3) back up your assertions with documentation.
By “forthcoming” I mean a willingness to share information beyond what a customer might ask for. If you provide a cloud SaaS offering as in the example above, this lets you “genericize” information, even to the point of leveraging standard deliverables. For example, you might consider participating in the CSA STAR registry, or preparing standard information-gathering artifacts like the CSA CAIQ, the Shared Assessments Standardized Control Assessment SIG, or the HITRUST Third Party Assessment Program in the healthcare space.
The second principle, demonstrating validation, means you’ve validated the assumptions that have gone into your security model. In the example above, this means we might back up the assumption of “no data stored internally” with validation of it. An assessor from a customer is much more likely to believe the statement if a control like DLP is used to validate it.
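The kind of validation described above can be illustrated with a minimal sketch. This is not how any commercial DLP product works internally; real products use far richer detection (fingerprinting, exact-data matching, classifiers). The file patterns and paths here are hypothetical, chosen only to show how a claim like “no customer data stored internally” might be spot-checked programmatically.

```python
import re
from pathlib import Path

# Hypothetical patterns a DLP-style scan might flag. Real DLP
# products use far more sophisticated detection than two regexes.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_tree(root: str) -> list[tuple[str, str]]:
    """Return (path, pattern_name) pairs for files containing flagged data.

    An empty result supports, but does not prove, the claim that no
    customer data is stored under this root.
    """
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; a real scan would log this
        for name, pattern in PATTERNS.items():
            if pattern.search(text):
                hits.append((str(path), name))
    return hits
```

Running a scan like this on a schedule, and keeping the (ideally empty) reports, gives an assessor something concrete to review rather than a bare assertion.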
The last point of having documentation means documenting the model you espouse. For example, if you can supply an architectural document that describes your approach: why you employ it, the risk analysis you performed beforehand, the controls in place to validate, etc. Back it up with a defined policy that sets forth security principles and expectations.
From the assessor side, there’s really only one principle: embrace flexibility where you can. If you understand the intent and rigor of the controls you expect, and a service provider happens to meet that same intent at the same level of rigor in a different way, offer the provider options other than requiring them to purchase and install controls they don’t need.
Again, none of this advice is rocket science, of course. But just because it’s obvious doesn’t mean that everyone does it. By doing some legwork ahead of time and looking through an empathic lens, you can streamline the assessment process in a situation like this.
As Winston Churchill famously said, “Never let a good crisis go to waste.” Optimistically, out of COVID, the mother of modern crises, we can reasonably expect some upside.
In 2020 I conducted as much CRM research as I would normally produce in several years. Vendors have the cash to fund research, as well as a belief that, once this is over, they’ll need something new and interesting to tell their markets.
The research shows there’s important and useful information to share, so it’s a good time to offer some findings.
As I wrote in August, despite a 25-year track record, CRM is still a young industry. The surprise in the IDC CRM market share numbers compiled for 2019, the most recent numbers available, is how miniscule the leaders’ shares are. As I wrote then,
For the seventh consecutive year Salesforce leads the pack, this time with 18.4 percent of the market. Other big vendors trailing the leader include SAP at 5.3 percent, Oracle at 5.2 percent, Microsoft at 3.7 percent, and Adobe at 3.6 percent.
Those trailing vendors have a combined market share of 17.8 percent, less than Salesforce alone. Even in aggregate, the top five command just 36.2 percent, considerably less than half of the market.
True, this indicates a vibrant group of smaller, successful vendors like Zoho, but it also raises an important question: Why, after all of this time, is there no greater center of gravity in the market? Why is there no 800-pound gorilla with two-thirds of the market share by now?
Technology changes more rapidly than a business’s ability to absorb it — and even with cloud computing it still takes a long time to get tech into the hands of users. Our research shows that many users, especially those working remotely, are in a position where their tools don’t fit their needs. We have data that corroborates this.
In a study we completed for Zoho (n=510), we discovered that customer-facing employees liked their jobs, felt their managers were capable and communicated well, and understood the expectations set by their companies.
In the Zoho study, the biggest complaint was the quality of the technology that employees had to use to do their customer-facing jobs. These people were exceedingly polite, and many offered no opinion of the technologies they used every day even though they could answer all other questions we posed to them. It seemed to us they were saying nothing because they didn’t want to offend.
In another study conducted for Oracle, we discovered even more troubling information. Our subjects all had CRM, but they didn’t like it and felt it was more work to use than it should be. These users rated CRM only fourth in a list of tools they use every day to work with customers. Ahead of CRM in the rankings were generic tools like email.
The top frustrations that sellers voiced included doing repetitive administrative tasks that could be automated and updating multiple systems that ought to be connected. These two frustrations alone provide a vivid understanding of the state of the art. Repetitive tasks and multiple systems speak of an era when CRM was highly stovepiped. The conclusion we draw is that these are the systems many people are still using.
Still, users keep doing the work in front of them. But this means using more labor to integrate data and systems manually when those things should be automated. It results in longer days and some bizarre activities. Indeed, 55 percent of sellers told us they rely on a combination of unintegrated applications, and 72 percent need three or more screens open at one time to do their jobs. Sometimes this involves a personal handheld device.
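The “updating multiple systems that ought to be connected” complaint has a well-understood remedy: let the rep enter data once and propagate it automatically. The sketch below is purely illustrative; the system names and the update interface are invented, and real integrations would go through each product’s API or an integration platform rather than in-memory dictionaries.

```python
# Minimal sketch of the "enter once, propagate everywhere" idea.
# System names and the update interface are hypothetical.

class RecordHub:
    """Fan a single field update out to every registered system."""

    def __init__(self):
        self.systems = {}  # system name -> {record_id: record fields}

    def register(self, name):
        self.systems[name] = {}

    def update(self, record_id, field, value):
        # One entry by the rep lands in every connected system,
        # instead of the rep retyping it three times.
        for store in self.systems.values():
            store.setdefault(record_id, {})[field] = value

hub = RecordHub()
for name in ("crm", "quoting", "support_desk"):
    hub.register(name)
hub.update("acct-42", "phone", "555-0100")
```

The point of the sketch is the shape of the fix, not the mechanism: the integration layer, not the seller, absorbs the duplication.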
The result is that people are working whenever and wherever they can — 60 percent say they work in the car, half have worked on vacation, a third (33 percent) have worked while at a social event, 24 percent while at the gym, and one-third have worked in the bathroom. Don’t ask.
What are we going to do about this? Is there a solution?
To a degree, it is what it is. Customer-facing jobs, especially sales, involve a modicum of chaos that will always be there. That said, the situation is exacerbated because we’ve been through many generations of CRM, and often those generations haven’t provided much benefit to the user. For instance, the great shift from client-server to the cloud took many years and ultimately saved companies bundles of money. But it was a literal translation that kept stovepipes intact.
We’re just now getting to the point where there’s decent integration or at least the ability to integrate systems; automation for whole processes and data capture and analysis are available but not in many hands yet.
There’s a lot of old CRM out there that’s still running and therefore keeping new systems at bay. The subscription model should be useful here since there’s little sunk cost investment in older cloud systems should a business wish to retool.
It’s clear, though, that many of the workarounds that have kept old systems running have reached the end of the line. If people are already working in their cars, the gym or the bathroom, it’s hard to see how things get any better without replacing systems.
That requires a different approach for the customer. The old adage of “if it ain’t broke, don’t fix it” doesn’t really apply any more. The software might not be broken, but the business processes it regulates often are.
COVID has us working from home, at odd hours and straining to get things done. Many of us would like to not have to consider retooling CRM. But there’s never been a better time to do so.
Our processes have changed and may not be able to snap back to a pre-COVID configuration. People who once commuted two hours per day have discovered how nice it is to have that time back. The video conference might not be a perfect thing, but it offers some advantages. In all of this we need to begin thinking of how we take advantage of the situation rather than simply trying to survive.
One positive from all of my research has been observing the attitudes of frontline workers. They’re resilient, like what they do and want to do more. If we didn’t have that no amount of CRM would matter. But with it business is preparing for a strong comeback. If we take advantage of the crisis.
Although software patches can be inconvenient and cumbersome for both enterprises and individual users, these fixes serve an important role in protecting computer systems, which are now vital to everyday life.
Earlier this month, a woman with a life-threatening condition passed away after hackers crashed the IT systems of a major hospital in the city of Düsseldorf.
The emergency patient could not be admitted for treatment because the Düsseldorf University Clinic could not access data after its systems had been disrupted for a week by an apparent ransomware attack. As a result, the woman was sent to a hospital 20 miles away, where doctors were not able to begin treatment for another hour. She subsequently died.
To sabotage the hospital systems, the hackers exploited the Citrix ADC vulnerability CVE-2019-19781, which can let attackers execute their own code on compromised servers. The “misdirected” attack reportedly was originally intended for Heinrich Heine University, according to an extortion note from the hackers.
Citrix issued a patch for the vulnerability on January 24, but it appears that the hospital had not yet installed the fix.
The same Citrix vulnerability was exploited on September 9 to attack the servers of Italian eyewear giant Luxottica Group, according to Italian cybersecurity firm SecurityOpenLab. That attack forced Luxottica to shut down operations in Italy and China.
Incidents like this raise the question of why corporations do not patch vulnerabilities as soon as software manufacturers issue a fix.
“Too many organizations are overly dependent on scanners to discover what needs to be patched,” Chloé Messdaghi, VP of Strategy at Point3 Security, told TechNewsWorld. These “provide only the extreme bare minimum of information.”
Many scanners are not up to date and don’t prioritize issues, Messdaghi said. “They can’t provide a trustworthy view into what’s critical to patch immediately, what may be a lower priority but requires timely action, and what may have less risk.”
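The triage Messdaghi describes (patch immediately, act in a timely manner, or accept lower risk) is at heart a ranking problem. The sketch below shows one hypothetical way to score findings; the weights are invented for illustration, and a real vulnerability-management program would tune them against its own environment and threat intelligence.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    cve: str
    cvss: float            # base severity score, 0.0-10.0
    internet_facing: bool  # exposure multiplies urgency
    exploit_known: bool    # public exploit code exists

def priority(f: Finding) -> float:
    """Hypothetical urgency score; real programs tune these weights."""
    score = f.cvss
    if f.internet_facing:
        score += 3.0   # reachable by anyone, patch sooner
    if f.exploit_known:
        score += 2.0   # attackers already have working code
    return score

def triage(findings: list[Finding]) -> list[Finding]:
    """Most urgent first: patch now, then timely action, then lower risk."""
    return sorted(findings, key=priority, reverse=True)
```

An internet-facing flaw with a public exploit, like the Citrix vulnerability in the hospital incident, rises straight to the top of such a list regardless of what else is in the queue.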
Even when IT staff patch vulnerabilities, they may not fully test those patches, she pointed out.
On the consumer side, users employ the same passwords on multiple sites or fail to implement basic cybersecurity measures. Those measures include installing antivirus or antimalware software, updating that software and their operating systems in a timely manner, and refraining from clicking on links or attachments in emails from unverified senders, or on links on the web pages they visit.
“Time and again, users have proven they’ll disregard expert advice, reuse credentials, and select simple passwords,” Dan Piazza, Technical Product Manager at cybersecurity firm Stealthbits Technologies, told TechNewsWorld.
Using passwords across multiple accounts is widespread, the United States Federal Bureau of Investigation stated in a private industry notification to the financial sector earlier this month.
“Successful attacks occur more often when individuals use the same password or minor variations of the same password for various online accounts, and/or…use login usernames that are easily guessed, such as email addresses or full names,” the U.S. Securities and Exchange Commission said in a risk alert issued on September 15.
Users’ failure to follow simple security procedures has long vexed cybersecurity experts and vendors.
In 2004, Microsoft’s then-CEO Steve Ballmer called on individual users to take responsibility for their own cybersecurity. In 2010, Cisco Systems asserted that cybersecurity is everyone’s responsibility.
High-tech and cybersecurity software vendors, banks, and other organizations have spent years trying to get consumers to follow basic cybersecurity rules. “Companies should now assume users will act against their best interests when it comes to credentials, and start forcing good habits for passwords and security,” Stealthbits’ Piazza advised.
Piazza recommended that firms trying to protect their networks against breaches consider real-time threat detection and response solutions and password policy enforcement software because “Convincing users to adhere to credential best practices is an uphill battle, so companies should start forcing good habits programmatically.”
The U.S. Cybersecurity and Infrastructure Security Agency (CISA), part of the Department of Homeland Security, on September 18 took a step toward enforcing vulnerability patching when it released an emergency directive requiring federal civilian agencies to patch a critical vulnerability in the Microsoft Windows Netlogon Remote Protocol, CVE-2020-1472, and strongly recommending that the private sector do the same.
The Netlogon vulnerability, for which Microsoft issued a patch in August, could let attackers take over domain controllers on a victim’s network.
CISA gave public sector IT departments the weekend — until midnight September 21 — to install the patch, remove domain controllers that could not be patched, and implement technical and management controls.
It’s “virtually inevitable” that some public sector systems will fall through the cracks, Saryu Nayyar, CEO of cybersecurity firm Gurucul, told TechNewsWorld. “Even the best run environments have strays.”
As for the private sector, “It’s likely that some organizations will weigh the organizational costs and delay addressing this directive based on assumed risk or resource concerns,” Nayyar added. Private companies may be forced to patch the Windows Netlogon flaw.
On February 9, 2021, Microsoft will begin to enforce new settings that will improve the security of the Netlogon Remote Protocol, Joe Dibley, security researcher at Stealthbits Technologies, told TechNewsWorld. The flaw will have to be patched first.
“Nearly all organizations have processes and procedures for ensuring their Windows systems received patches in an automated and timely manner, but very few have strategies for any other products in their environment,” Chris Clements, VP of Solutions Architecture with managed security services provider Cerberus Sentinel, told TechNewsWorld. “The state of patching for network appliances is often abhorrent, simply because the responsibility hasn’t been clearly defined.”
That said, corporations “can absolutely be made to take more responsibility for their own cybersecurity,” Mounir Hahad, head of Juniper Threat Labs, told TechNewsWorld.
On the consumer side, users pay lip service to cybersecurity, an online survey of 1,000 people across the U.S. conducted in May by professional network services and accounting firm KPMG found.
About 75 percent of the respondents consider it risky to use the same password for multiple accounts, use public WiFi, or save a card to a website or online store, but more than 40 percent do these things, according to the survey.
“Consumers are their own last line of defense when it comes to cybersecurity,” Stealthbits’ Piazza remarked. “Although businesses and governments have a responsibility to protect sensitive data in their possession, ultimately consumers can ensure their digital well-being by following cybersecurity best practices themselves.”
“When new security features are added to a website or software, users are typically only OK with them if they’re not impeded in any way or if they can see an immediate, tangible benefit.
“Most best practices for personal cybersecurity don’t come with strong, immediate motivating factors for consumers unless they look at the big picture,” Piazza said.
The consumer is not to blame, Juniper’s Hahad contends. “Cybersecurity professionals would like to enlist the help of consumers in limiting or mitigating cybersecurity risk, but we cannot hold them responsible for things they do not understand,” he said.
The onus, in his view, is on businesses to ensure cybersecurity for themselves and consumers.
“We would like consumers not to keep default passwords, but we’d rather require companies not to allow default passwords to persist,” Hahad said.
“We can ask consumers to choose stronger passwords, but we’d rather have services refuse a weak password. We can ask consumers not to reuse passwords, but we’d rather have a consortium checking passwords are not being reused across sites or services,” he explained.
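Hahad’s “consortium checking passwords” idea has a practical precedent: a service can reject passwords that appear in breach corpora without ever seeing the plaintext leave the client, using a hash-prefix lookup in the spirit of the k-anonymity range scheme popularized by Have I Been Pwned. The sketch below is a local toy; the breach list is invented, and a real deployment would query a remote range API rather than hold the corpus itself.

```python
import hashlib

# Toy stand-in for a breached-password corpus. A real service would
# query a range API (k-anonymity style) instead of storing this list.
BREACHED = {"password123", "qwerty", "letmein"}

# Index breached hashes by their first five hex characters, so a
# lookup only needs the prefix, never the full password or hash.
BREACHED_SUFFIXES = {}
for pw in BREACHED:
    digest = hashlib.sha1(pw.encode()).hexdigest().upper()
    BREACHED_SUFFIXES.setdefault(digest[:5], set()).add(digest[5:])

def is_breached(candidate: str) -> bool:
    """Check a candidate password against the corpus by hash prefix.

    The matching side sees only a 5-character prefix; the client
    compares the returned suffixes locally.
    """
    digest = hashlib.sha1(candidate.encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    return suffix in BREACHED_SUFFIXES.get(prefix, set())
```

A signup form wired to a check like this simply refuses known-breached passwords, which is exactly the “have services refuse a weak password” posture Hahad argues for.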
One way around this is to implement privacy by design, which is the new normal when designing software, websites, and services, Piazza commented.
“While consumers can’t be legally forced to follow security best practices, government regulations will force organizations to employ better safeguards, which in turn will result in more enforced policies surrounding user password selection, the use of multifactor authentication, and other aspects of the consumer authorization workflow,” he concluded.
Toronto-based telecom company Unite Communications in July launched a subsidiary company called TextMeAnywhere to help retailers manage customer curbside pickup and other customer contact services during COVID-19 and beyond.
The new service uses a proprietary web application that transforms a business landline, VoIP, or toll-free number into a textable number. Its official service area covers all of North America, and it works elsewhere unofficially.
One purpose of the service is to make it easier for customers to communicate with store personnel when arriving for curbside pickup. Patrons can also send text messages to merchants about any other inquiries.
Retailers deemed essential during the pandemic restrictions had an unintended advantage because they were forced to adapt from the beginning and provide curbside pickup as the only option. But now other retailers are beginning to follow, according to Brian Presement, CEO of Unite Communications.
TextMeAnywhere gives merchants that lack separate numbers for voice and texting a way to provide both capabilities on an existing phone line. It is a business-to-business service for any company that deals with the public.
Retailers have needed to quickly adapt to the demand for this new messaging option, frequently running into logistics issues when their business phone numbers are landlines. Many had to ask staff to manage multiple phone calls, send emails that get stuck in spam, or purchase mobile phones and numbers so their workers did not have to use their personal devices.
The TextMeAnywhere service turns the business phone number into a textable number and allows staff to easily chat with customers on the web application. The business number is likely already stored in the customer’s phonebook, so they know who is texting them, according to Presement.
“Some 150 million text messages are sent daily and go missing because they are sent to numbers that are not digital lines for texting,” Presement told the E-Commerce Times.
Curbside pickup has become a necessary option to offer customers. In fact, some shoppers will not consider doing business with a retailer or restaurant that does not offer curbside service. The TextMeAnywhere solution makes the associated logistics easier for small businesses to execute.
TextMeAnywhere allows a merchant’s current business number to receive forwarded texts sent by customers, and merchants can reply to those messages.
The backend is similar to what happens with a cell phone. The servers identify the carrier of the received message. The client does nothing different. Setup takes less than 10 minutes.
The process works without merchants needing to set up complicated processes, purchase additional hardware, and advertise additional phone numbers. The text message exchanges happen via a web-based portal.
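The routing layer behind a portal like this can be pictured simply: an upstream SMS gateway delivers each inbound message, and the portal files it under whichever merchant owns the receiving number. Everything in the sketch below is hypothetical, including the numbers, the merchant names, and the gateway delivering messages as plain (to, from, body) values; TextMeAnywhere’s actual implementation is not public.

```python
# Hypothetical sketch of a web texting portal's routing layer.
# All numbers, names, and the gateway interface are invented.

class TextPortal:
    def __init__(self):
        self.owners = {}   # business number -> merchant name
        self.inboxes = {}  # merchant name -> list of (customer, body)

    def register(self, merchant, business_number):
        """Claim an existing landline/VoIP number as textable."""
        self.owners[business_number] = merchant
        self.inboxes.setdefault(merchant, [])

    def inbound(self, to_number, from_number, body):
        """Called by the SMS gateway for each message received."""
        merchant = self.owners.get(to_number)
        if merchant is None:
            return False  # not a registered textable number
        self.inboxes[merchant].append((from_number, body))
        return True

portal = TextPortal()
portal.register("Main St Books", "+14165550100")
portal.inbound("+14165550100", "+14165550199", "Here for curbside pickup!")
```

The customer texts the number already saved in their phone; only the merchant side needs the web portal, which matches the division of labor the article describes.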
Only merchants need to use the portal. Customers can send and receive text messages with merchants as they do with regular texts.
It is fast, modern, convenient, and in-demand, according to Presement. The web-based messaging portal eliminates the pressure businesses face in being forced to have a new number for texting.
“Ninety percent of people prefer texting to voice messages. They don’t want to wait on hold,” he said.
The penetration rate for mobile phones in North America is 92 percent. People check their texts much more quickly than they do their email.
“Texting has an average time to open of 90 seconds. An email has an opening delay of 90 minutes. Texting gets a lot more attention,” said Presement.
In addition, SMS messages see a 98 percent open rate, whereas emails only see a 20 percent open rate. Texting is the preferred communication method for most customers, he said.
Right now the TextMeAnywhere service uses a one-price-fits-all model at US$15.95 per month, noted Presement. That covers unlimited incoming text messages and 500 outgoing messages; after that, it is a penny per message. The company’s website also lists a $30 activation fee.
Unite Communications has operated for 20 years. Presement spun off the texting service to fill what he hopes will be a waiting market. It is a separate company with its own infrastructure and staff.
“Our biggest challenge is getting the word out. The battle is showing people that this is a service they didn’t know they needed,” he said.
His goal is to enhance the service every quarter. The company is working on a feature list; one of the first additions will be a keyword auto-response function.
Unite Communications may have picked a good time to venture into this wide-open field, noted Charles King, principal analyst at Pund-IT.
“As Shopify’s recent massive jump in share value indicates, companies outside the Amazon sphere continue to struggle with establishing and managing virtual business transactions. Unite seems to be in the right place at the right time,” he told the E-Commerce Times.
Businesses, especially smaller companies, have a high bar to clear when it comes to effectively addressing and managing the mobile transactions that an increasing number of consumers take for granted, he observed.
“If Unite Communications can supply the services those businesses need, it could become an invaluable ally in the next phase of online and mobile commerce,” King suggested.
While many people have been stuck at home during the pandemic, virtual travel marketing has emerged to help fill the void felt by those who yearn for faraway places.
The E-Commerce Times caught up with some virtual travel experts to discover how they’re using augmented and virtual reality, video and other technologies, to give folks the opportunity to explore the world without leaving their homes.
“As the pandemic has left many stuck indoors, cautious and scared, there is a common need to feel normal,” Lynn Kaniper, president of Dana Communications, told the E-Commerce Times.
“The ability to dream of vacation provides hope and inspiration for the future. Virtual travel marketing provides businesses the advantage of allowing their customers to have an immersive experience by connecting them directly with their products and services,” she said.
In an increasingly closed-down world, virtual travel opportunities have become more important than ever.
“We know from our monthly travel sentiment study that people have not stopped dreaming about vacations during COVID,” Clayton Reid, CEO of MMGY Global, told the E-Commerce Times.
“In fact, being anchored to home has pronounced the amount of time people are spending with the inspiration and shopping elements of travel. Strong virtual experiences have helped people stay connected to what an ultimate trip will offer and helped to keep long-term leisure travel intent high,” he noted.
Virtual travel marketing can offer a close-to-reality experience, inspiring people to learn about destinations, dream about journeys, and eventually book trips.
“Video is one of the top marketing tools that give the ability to experience through sights and sounds,” explained Kaniper. “Nowadays, 360 tours of destinations or properties allow the audience to experience and set expectations for their travels. Social media has been a way for properties, destinations and travel providers to communicate with their audiences in real time.”
Virtual travel experiences are beginning to imitate the immediacy and thrill of actual travel, and that kind of authentic experience is important for the success of these new marketing techniques.
“Virtual travel marketing has become more and more engaging,” Kaniper continued. “You can have a virtual experience flying in a plane, skydiving, riding a rollercoaster, or simply taking a tour of a resort or hotel. Virtual travel continues to evolve, as people look to have more experiences and to places that they may never get to experience, such as the moon or Mars.”
For a virtual travel marketing campaign to be successful, it needs to engage people as much or more than would an actual trip. When it’s effective, this kind of marketing can inspire people to plan and book a trip, now or in the future.
“It’s all about the ‘wow’ factor,” John Graham, president of Travel World VR, told the E-Commerce Times. “It has to be exciting and energetic. 360/VR videos need to have an even flow to keep the audience engaged, wanting to learn and see more. It will also lead to increased travel bookings and shorten the sales cycle dramatically.”
Virtual travel marketing can be particularly successful when it highlights the many benefits it offers consumers.
“Today virtual is more important than ever,” Kaniper asserted. “Parents are homeschooling and want ways for exploration for their children. Those with wanderlust can have experiences, but in a safe, comfortable way. It is also an eco-friendly solution to over-tourism.
“Whether it’s vacations, cruises, or meetings, virtual gives the ability to plan and know before you go or the ability to get away from it all. Those who aren’t able to travel can now go anywhere in the world, making the bucket list more attainable than ever.”
It’s also important for consumers to see and understand the relevance of virtual travel in their daily lives.
“Virtual and augmented reality programs for theme parks such as Cedar Fair which features ‘immersive living stories,’ and destinations such as Berlin, where visitors can experience the city in advance of their actual trip, allow a new expression of what travel can now represent,” said MMGY’s Reid. “Tying specific video or animated content to a VR/AR experience brings travel brands to life in new and special ways and allows more relevance for travelers.”
Virtual travel is likely here to stay, long after the pandemic ends. It offers marketers the chance to reach a wide range of customers, and it gives people the chance to travel from the comfort of their own homes.
“VR travel is becoming more mainstream each day,” said Travel World VR’s Graham. “It’s evolving into the ultimate marketing tool for all categories of travel suppliers, travel sellers and their clients, the consumer. It’s changing how travel is being viewed — not only now, but in the future.”
Though spurred on by the crisis of the pandemic, virtual travel marketing is increasingly becoming the norm.
“When the pandemic hit, everyone was scrambling,” Adam Stoker, president and CEO of Relic, told the E-Commerce Times. “How do you market a destination when people are unable to travel? It became immediately apparent to everyone in the tourism industry that while people are unable to travel now, the competition will be fierce when travel opens up again.
“So many destinations decided to figure out ways to allow people who were cooped up in their homes the opportunity to virtually experience the destination. The theory is that if people can see from their homes how amazing a destination is, they will be more likely to book a physical trip to the destination when it’s safe.”
Many of the techniques being used by virtual travel marketers during the pandemic will, in fact, likely permanently transform the world of travel marketing.
“This phenomenon of virtual travel marketing may have appeared in the pandemic, but I see this as being a kickstart to a new wave of creative execution in the industry,” explained Relic’s Stoker.
“The success destinations see as a result of this marketing strategy during the pandemic will lead to continuous virtual content creation as time goes on. It’s something the industry has needed to invest in for a long time. The pandemic was just an accelerator,” he suggested.
Many people have worked at less than full speed over the last seven months or more, but some people managed to get more done than others. A raft of new technologies that impact CRM are about to be announced, but even without the latest announcements due now through October, there’s a realization that we’re coming full circle. Things we thought about and argued over decades ago are back, in different forms, with new solutions.
Oracle is making hardware sexy again, using it to drive new business models and to push its CX version of CRM, and Salesforce is tackling a kind of mass customization of its CRM product line.
Hardware has always been important, but once small computers could be networked with larger servers, much of the angst associated with running large groups of users melted away.
We no longer think very hard about how many kilobytes of memory a user needs, even on handheld devices that have gigabytes available. Performance isn’t much of an issue unless you are running many, many users banging on an inference engine. Even if that’s an issue for you, throwing hardware at the problem works better than ever.
But more iron isn’t as important as which iron. Today’s marginal apps are compute-hungry because they crunch data to feed inference engines and algorithms that assist in real-time decision-making, part of what’s called HPC or high-performance computing.
The important wrinkle in all this is how to deliver HPC from the cloud — and the answer starts with the GPU, or graphics processing unit, today’s equivalent of the 80287 math coprocessor. Modern graphics are rendered through math calculations, and the GPU crunches the numbers. Naturally, another use for the GPU is all of the probability processing needed to deliver recommendations to customer-facing employees.
This week, Oracle announced the general availability of Nvidia A100 support in the Oracle Cloud. This support provides performance comparable to Nvidia’s DGX workstation but through the cloud, which Oracle says is “A great alternative for customers that need the absolute performance for their workloads such as deep learning training or hardware accelerated visualizations.”
Translation: You don’t need to host your compute-intensive apps down the hall anymore. The cloud can handle that, though at this point it’s wise to run your own benchmarks.
This and the rest of Oracle’s hardware announcements focus on bringing multiple compute resources to users rather than the single ones available back in the minicomputer era. But this also brings a good deal of memory — up to 2TB in some cases — because there’s nothing like having data available to be crunched in nanoseconds instead of waiting milliseconds to get the data from disk. These were all issues we obsessed over decades ago, and they’re still issues, but we have many more tools.
The practical impact at the business model level is to remove a layer of excuses for not migrating even large corporate workloads to the heavens.
Another issue from the way back machine is vertical market CRM solutions. There’s always been a tension between the idea of selling generic CRM and that of delivering solutions highly customized to the needs of an industry’s best practices. In other words, how to mass customize CRM.
The target has moved primarily because software platforms make it easier than ever to build a generic function that can be customized to vertical market needs. David Schmaier, co-founder of Vlocity, a company that Salesforce paid US$1.3 billion for recently, has this down. However, he sees a need for fewer than the 24 industry apps he oversaw at Siebel.
Vlocity specializes in vertical market CRM apps for a handful of large industries. Now, with Salesforce’s resources behind it, Vlocity should be able to do more, especially since Salesforce already had an industry orientation. Look for more announcements on hardware acceleration and industry solutions in the weeks ahead.
All of this comes along at an auspicious time. There’s already a need for better apps and more compute power that can be delivered to today’s home warriors. Propitious announcements like these arrive at this time of year, like ripe apples, in time for show season. There will only be virtual shows this year; Dreamforce and OpenWorld could turn the Moscone Center into a super-spreading virus incubator if the companies tried to go forward — and who wants that?
But these announcements and others like them hit the mark for what’s needed now. Over the last few months, I conducted two surveys of over one thousand customer-facing employees to learn about their daily routines and their use of technology. To my surprise, the systems most of them use are antiques.
Despite the big investments in CRM, we’ve really only scratched the surface. Systems bought even in the last few years are woefully inadequate for today’s demands. One item from one survey especially sticks in my mind: CRM was fourth from the top of the list of apps that people rely on to do customer-facing jobs, behind email and social media.
Seriously?
Worse, customer-facing reps will go to great lengths to avoid updating their CRM systems, and they do. So it looks like a new generation of technology to support users can’t come soon enough. Advances in things like GPU processors and industry-specific CRM will add muscle to our day-to-day work effort. Who says there’s no good news around?
According to gaming experts, 2021 is shaping up to be a big year for cloud gaming.
“We are expecting a big jump in revenue from 2020 to 2021,” observed George Jijiashvili, a senior analyst with Omdia and author of a report on cloud gaming released last week.
“We think consumer use of cloud gaming will reach US$4 billion, which is a growth rate of 188 percent, a massive jump from 2020,” he told TechNewsWorld.
His report also predicted cloud gaming revenues would reach $12 billion by 2025.
A proliferation of cloud gaming services launching in 2021 should brighten the sector’s revenue picture.
“There are currently 25 cloud gaming services in beta globally,” noted Piers Harding-Rolls, a games industry analyst with Ampere Analysis.
“Some of those will commercially launch in 2021 and drive awareness, adoption, and monetization,” he told TechNewsWorld.
“More significantly,” he continued, “Xbox has added a cloud gaming feature to Xbox Game Pass Ultimate, which will, if included in the market sizing, add value to the opportunity.”
“I also expect Sony’s service PS Now to continue growing in 2021,” he added.
Mark N. Vena, a senior analyst with Moor Insights & Strategy, noted that despite some less-than-spectacular performances from Google’s Stadia service, the overall cloud gaming category will continue to rise in 2021 and beyond.
“Microsoft and Sony — to a lesser extent — will grow the market with their forays into the cloud gaming arena with their new console launches that will occur in the holiday season, and Google will widen their library of games,” he told TechNewsWorld.
“The pandemic will also continue to be a tailwind for gaming in general,” he added.
Microsoft will be a big driver of cloud gaming revenue in 2021, according to analysts interviewed by TechNewsWorld.
“There are 15 million Xbox Game Pass subscribers,” Jijiashvili explained. “Not all of them will be Ultimate subscribers, but a big chunk of them will. So next year, there will be several million cloud gaming players just from Xbox alone.”
Kristen Hanich, an analyst with Parks Associates, noted that Microsoft has had some aggressive Game Pass promotions with the goal of bringing more attention and trial users to the offering.
“Once customers understand the value proposition of this model, many will stay and pay full price,” she told TechNewsWorld. “It should be fairly easy to upsell these subscribers to Game Pass Ultimate, the cloud gaming tier.”
Microsoft’s termination of the 12-month Xbox Live Gold Pass could be used as part of that upsell.
Michael Inouye, a principal analyst with ABI Research, explained that a monthly Xbox Live Gold membership is $9.99 a month or $24.99 a quarter, which amounts to $8.33 a month. Ultimate Game Pass is $14.99 a month, which includes access to select PC, console, and cloud titles, cloud gaming, and Xbox Live Gold.
“In effect, you are getting everything beyond Gold for $5 per month, assuming you went with the monthly Gold plan,” Inouye told TechNewsWorld.
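Inouye’s arithmetic can be sketched out in a few lines (prices as quoted above; Microsoft’s pricing may of course change):

```python
# Xbox subscription prices as quoted in the article (USD).
gold_monthly = 9.99        # Xbox Live Gold, billed monthly
gold_quarterly = 24.99     # Xbox Live Gold, billed per quarter
ultimate_monthly = 14.99   # Xbox Game Pass Ultimate, billed monthly

# Effective monthly cost of the quarterly Gold plan.
gold_quarterly_per_month = round(gold_quarterly / 3, 2)   # 8.33

# What Ultimate costs on top of each Gold plan.
premium_vs_monthly = round(ultimate_monthly - gold_monthly, 2)                 # 5.00
premium_vs_quarterly = round(ultimate_monthly - gold_quarterly_per_month, 2)   # 6.66

print(gold_quarterly_per_month, premium_vs_monthly, premium_vs_quarterly)
```

So the “everything beyond Gold for $5 per month” figure holds against the monthly Gold plan; against the quarterly plan the premium is closer to $6.66.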
One reason the sector’s growth numbers may look large is its low starting point.
“Revenues are pretty much non-existent now,” said David Cole, an analyst with DFC Intelligence.
“You’re really looking at who is positioning themselves to be a leader going into 2021,” he told TechNewsWorld.
“Microsoft’s strategy — to bundle cloud gaming into all their other content — is where cloud gaming is going,” he continued. “It doesn’t exist as its own service, but as part of a larger subscription.”
Attractive pricing, though, won’t deliver on the big revenue potential of cloud gaming without appealing content.
“At the end of the day, gamers are willing to pay for great content,” observed Lewis Ward, a research director for gaming at IDC.
“So the main driver, in my view, will be an increasing number of excellent games in these services,” he told TechNewsWorld.
“If you haven’t got the content that people want to play, there will be no uptake of cloud gaming,” Jijiashvili added.
James Moar, a research analyst with Juniper Research, explained that, as with video streaming, cloud gaming will rely on having appealing content libraries.
“If the latest titles can be played at high quality without the need for new hardware, then cloud gaming will see strong revenue growth in developed markets like North America and Europe,” he told TechNewsWorld.
Convenience will also attract players to cloud gaming, added Parks’ Hanich.
“Gamers want to play their games on multiple devices,” she explained. “That is especially true of gaming enthusiasts who game for many hours per week. They game on their PCs and consoles, but also on their mobile devices when they’re on the go.”
When a market grows, it can be at the expense of other markets. That won’t be the case for cloud gaming, at least for the next five years.
“Cloud gaming will contribute to the overall growth of gaming,” Jijiashvili maintained. “Cloud gaming will be a ‘nice to have’ addition to a player’s existing gaming, either on console, PC or mobile.”
Ross Rubin, the principal analyst at Reticle Research, explained that in 2021, cloud gaming will be used by players to extend their reach. “It’s about playing games that you purchased for other platforms,” he told TechNewsWorld.
“It’s a year where more consumers will get exposed to cloud gaming and see how well it works,” he continued. “It may set the stage for shifting more of their gaming to the cloud in the coming years.”
Hanich added that cloud gaming might add to game developers’ revenues from game sales.
“Participating in a gaming subscription catalog may actually boost a title’s revenue since it’ll get the game in front of more players who may not have purchased it on their own,” she said.
“Plus,” she continued, “with what we’ve heard from Microsoft, many times people who play a game on Game Pass will also go and buy that game later so they won’t lose access to it if it’s rotated out of the Game Pass catalog.”