
OPINION

IBM’s Innovation Strategy: Preserving the Foundation

By any measure, IBM Connect, the annual IT analyst summit hosted by the company’s Software Group (SWG), should have been a walk in the park. After all, the event followed close on the heels of IBM’s Systems and Technology Group (STG) analyst briefing, and the central purpose of both events was to underscore the company’s 2009 decision to unite SWG and STG under a single banner and leader: SVP and Group Executive Steve Mills.

Rather than simply fill in a paint-by-numbers canvas, however, IBM Connect added significant depth to the company’s once bifurcated, now conjoined, software/systems strategy.

The event started with a keynote presentation by Mills, a bright and articulate guy — able to hold forth with easy self-assurance on virtually any topic that touches the SWG/STG bailiwick. Though his subject was “Smarter Software for a Smarter Planet,” Mills started by tackling the SWG/STG unification, suggesting that the “only change” resulting from the move was his job title. That tongue-in-cheek point served to emphasize the fact that deep software/hardware integration is anything but a new subject at IBM.

It’s the Software

In fact, Mills noted that software “secret sauce” has long been a crucial element in the performance leadership of IBM’s System z mainframe and Power systems, and also plays a critical role in the company’s System x x86/64 servers and storage solutions.

Analysts and the market should anticipate additional, deep product integrations in the future, he said, as IBM leverages software to enhance performance across systems, down to and “within the footprint of the microprocessor.” This is an interesting point of differentiation, since among Tier 1 IT vendors, only IBM and Oracle/Sun continue to develop their own native microprocessor architectures.

In fact, Oracle provided some subtext to Mills’ keynote, which is not surprising given its aggressive competitive positioning and system performance claims vis-à-vis IBM. Oracle and its CEO, Larry Ellison, drew some salty comments from Mills, though he made them within a broader discussion of computing customers’ needs.

Since enterprises across the globe leverage millions to billions of often highly customized business applications, Mills noted, the IT “world we live in isn’t about the ‘next new thing’ but about how well new things can integrate with established applications and processes.”

The key for vendors, Mills suggested, is to “be thoughtful — don’t damage or destroy the [IT] ecosystems customers have taken years or decades to build.”

Notable Breakouts

As with virtually any IBM event, it was impossible for a single analyst to cover Connect’s sessions and topics in detail. One that I found particularly valuable focused on IBM’s Netezza solutions, co-hosted by Arvind Krishna, GM of IBM’s Information Management Software, and Jim Baum, CEO of Netezza.

IBM completed its acquisition of Netezza in November 2010, citing the value of the company’s data warehouse and business analytics appliances. But Netezza’s concept of “appliances” is grander than some might expect; the company’s flagship TwinFin solutions scale to support petabytes of data and deliver 10 to 100 times the performance of conventional systems.

New upgrades and future solutions will improve performance even further, in large part via integrated software enhancements, thus highlighting IBM’s “secret sauce” strategy.

Netezza also underlines the broader value of the SWG/STG integration. After managing pilot projects with IBM, HP and Sun two years ago, Netezza decided to standardize on IBM’s BladeCenter architecture and worked closely with the company’s System x organization.

Why did Netezza choose IBM? Because of the System x team’s “sensitivity to [our] goals and their willingness to make it work,” a decision that helped to inspire the acquisition, according to Baum. Overall, the IBM/Netezza relationship illustrates the scope and potential of what might be called IBM’s taste and talent for “creative collaboration.”

IBM SWG/STG – Looking Ahead

Overall, Mills and company did a solid job explicating the value of the IBM SWG/STG integration at Connect, particularly in aligning the strategy with customers’ current and upcoming business requirements.

The event also provided deep insight into IBM’s progress and position in a systems market where virtually every major vendor is trying to leverage software to define and enhance the value of hardware solutions. But mere pursuit does not suggest that every vendor is succeeding equally.

Consider Oracle, which seemed much on the minds of IBM’s executives: not surprising, given the ludicrous sideshow of Oracle’s recent litigation against SAP.

The day after IBM Connect wrapped, Oracle announced a record-breaking TPC-C benchmark result for Oracle Database 11g, in which a SPARC SuperCluster delivered almost three times the database performance of IBM’s recent (August 17, 2010) leading TPC-C result, achieved with DB2 running on a POWER7-based server cluster.

While a near-3X improvement in TPC-C is certainly impressive, what did it take to get there? According to the Transaction Processing Performance Council, it took the following:

  • For Oracle: a cluster of 27 SPARC T3-4 servers with 64 cores each (a total of 108 chips, 1,728 cores and 13,824 threads) = 30,249,688 tpmC.
  • For IBM: a cluster of three Power 780 servers with 64 cores each (a total of 24 chips, 192 cores and 768 threads) = 10,366,254 tpmC.

Bottom line: Achieving that 3X performance required Oracle to use nine times as many servers, more than four times as many chips, nine times as many cores, and 18 times as many threads as IBM.
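
For readers who want to double-check that arithmetic, the short Python sketch below simply divides the published Oracle figures by the IBM figures cited above. The dictionaries and variable names are mine, used only to reproduce the ratios (and the roughly 2.9X tpmC gap); they are not part of either vendor’s disclosure.

    # TPC-C configuration figures as cited above (Oracle SPARC T3-4 cluster
    # vs. IBM Power 780 cluster); the ratios follow by simple division.
    oracle = {"servers": 27, "chips": 108, "cores": 1728,
              "threads": 13824, "tpmC": 30_249_688}
    ibm = {"servers": 3, "chips": 24, "cores": 192,
           "threads": 768, "tpmC": 10_366_254}

    for metric in ("servers", "chips", "cores", "threads", "tpmC"):
        # Expected output: servers 9.0x, chips 4.5x, cores 9.0x,
        # threads 18.0x, tpmC 2.9x
        print(f"{metric:>8}: Oracle/IBM = {oracle[metric] / ibm[metric]:.1f}x")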

Oracle’s hardware-heavy announcement also sounded distinctly old school for a company trying to rapidly transform itself from a “new kid in town” systems vendor wannabe to varsity quarterback.

Marketing Déjà Vu

In fact, it was oddly reminiscent of long gone and largely forgotten Sun publicity efforts — particularly the 2001 promotion of its Sun Fire 15K “Starcat” servers. Despite Sun’s boastful positioning of the Sun Fire 15K as an alternative to IBM’s System z mainframes, the massive (up to 106 processors) servers never made significant inroads in IBM accounts or the broader market.

However, judging from the tenor of Oracle’s TPC-C self-promotion, the Starcat marketing team is apparently still alive and well.

So, what does this say about IBM and its blended Software/Systems organization? Simply this — that as IT evolution continues and vendors and their customers shift increasingly toward commodity hardware, software will drive competitive differentiation and business value more and more.

IBM understands this on an elemental level, and is moving toward a future in which software assets provide unique features for, and enhance the development and performance of, new company systems and solutions from the microprocessor up.

That demonstrates an impressive level of technological expertise, but it doesn’t stop there. IBM’s strategic vision is implicit in Steve Mills’ comment: “The world we live in isn’t about the ‘next new thing’ but about how well new things can integrate with established applications and processes.”

That idea, that building the future requires careful preservation and stewardship of past foundational efforts, is clearly and completely understood by businesses of every stripe and by true systems vendors like IBM.

Ironically, vendors and IT customers who ignore this concept to embrace the ever-changing “next new thing” risk being rightfully left behind and forgotten.


E-Commerce Times columnist Charles King is principal analyst for Pund-IT, an IT industry consultancy that emphasizes understanding technology and product evolution, and interpreting the effects these changes will have on business customers and the greater IT marketplace.
