
Integration Is the Killer App

In 1975 Niklaus Wirth, the Swiss computer scientist who created the Pascal programming language, published a seminal book entitled Algorithms + Data Structures = Programs. If Wirth had written about business applications, Computing + Storage = Applications would have been a better title. Of course, in 1975 there weren't that many business applications. Most of them ran limited back-office functions on mainframes. PCs weren't on the map. The killer app for PCs - the first spreadsheet - wouldn't be created until 1979, when VisiCalc appeared.

More than a quarter of a century later, things have changed. Most notably, applications are connected. TCP/IP and Ethernet provide a base set of networking standards. Distributed computing standards, XML and Web services being the latest ones, enable cross-application communication. A modern rendition of Wirth's statement could be Computing + Storage + Connectivity + Open Standards = Applications. In a world of connected applications, integration is the killer app. It is the key to growing the business value of applications. To really understand the significance of this claim, it helps to put the future in perspective by looking at the broader trends in computing, storage, connectivity, and open standards.

Trend Watching
Moore's Law - named after Gordon Moore, cofounder of Intel - has defined computing trends since 1965. Since its last revision in 1975, it states that the number of transistors on an integrated circuit doubles roughly every two years - often quoted as every 18 months. Moore's Law is expected to hold for at least a decade longer. Processors, memory, and other circuitry will get both faster and smaller. Computers will crunch through more data. Mobile devices will have significant independent computing capabilities, enough to run basic office applications. Embedded microprocessors, already present in everything from smart cards to household appliances, will start appearing in ever more places. And, just as Moore's Law is starting to give way, quantum computing may be within reach.

Storage also exhibits powerful trends. Storage density has increased by nearly 50% per year. Total storage capacity has been increasing much faster - more than doubling every year. Costs are falling rapidly. Disk storage is approaching the cost of tape storage - around $1/GB. You can buy a terabyte of disk storage for less than $1,000. At that rate, it will soon be possible to keep all data online all the time. Portable devices are also increasing their storage capabilities dramatically. CompactFlash cards of 1GB are readily available. In the same form factor, Hitachi Microdrives provide several gigabytes of storage. For smart phones there are MultiMediaCards. With comparable capacities, they are smaller, more secure, and more efficient. Looking further ahead, new types of optical memory, such as holographic storage, are something to look forward to.

Connectivity hasn't exhibited the predictable trends of computing and storage. Instead, it has exploded. The Internet boom was marked by massive investment in capacity. Dark fiber is still waiting to be lit while telcos go bankrupt restructuring their expansion debt. Most businesses and many households have broadband connectivity. Broadband capabilities are increasing. New technologies allow gigabit traffic through existing cable lines. Still, the progress in wired connectivity pales in comparison to the great strides in wireless broadband.

Businesses and homes are embracing untethered computing at increasing rates. The suite of 802.11 standards already offers secure bandwidth up to 54Mbps. 3G may be forever behind schedule, but wireless providers are deploying 2.5G, and 802.11 hotspots are popping up in many areas. Two-way satellite offers decent connectivity even in remote locations. The FCC recently opened unlicensed spectrum around 60GHz, which allows point-to-point wireless connectivity at gigabit speeds. Mesh networking eliminates the signal strength issues that surround point-to-multipoint approaches (such as 3G and satellite). At the other end of the wireless spectrum, connectivity is about lowering costs. Very low-cost self-organizing networks seamlessly connect devices such as industrial sensors without wires and without configuration. Further down the cost scale (at mere cents) are RFID tags that will allow, for example, FedEx packages to be tracked automatically as they enter and leave a distribution center.

Computing, storage, and connectivity are meaningless if they cannot be used together to build and connect applications. That's why we need standards. TCP/IP and Ethernet provide the base of networking standards on top of which we run naming services through DNS and DHCP, the Web through HTTP, e-mail through POP/IMAP, and many more. For application development the industry is converging on two runtime platforms: Java/J2EE and Microsoft's .NET. However, the biggest strides on the standards front have recently been made in integrating applications. XML is steadily becoming the workhorse of data definition and representation. Soon, we won't be hearing much about XML. That will be a good thing - the final indicator that XML has found its proper place in the stack of commonly used technologies. Nobody talks about ASCII anymore; it's taken for granted. The same will happen with XML. Web services have some way to go to get there; there is still too much overpromising and underdelivering.

Despite the hype, one thing remains true: Web services are our current best hope for an open, flexible, and comprehensive set of standards covering distributed computing and application integration. First, all the major platform vendors are behind this movement. Second, the standards are designed in a way that supports the distributed evolution of related specifications by different vendors and different standards organizations. This industry process, although it is not a pure "one company, one vote" democracy, is a significant improvement compared to the traditional extremes of authoritarianism (DCOM) or design-by-consensus (CORBA).

There you have it. The future will offer much computing power and storage spread across an ever-increasing number of devices. Each new generation of computing devices has dwarfed the previous one in unit volume: from mainframes to PCs to handhelds and phones, and now to everything with a capable microprocessor. Plenty of connectivity and a solid base of standards will enable many distributed applications. New grid architectures will provide abstractions that focus on what services are needed as opposed to where they are located and how they are accessed.

If we look at the trends, the future is bright. All this innovation embodied in connected distributed applications running everywhere - wow! What's wrong with this picture? There is a key ingredient missing - integration. The future can bring great value, but integration is the key to unlocking the potential of this future world.

Metcalfe's Law Redux
To understand the role of integration, it helps to examine one of the pillars of the Internet, the network effect argument. It goes something like this: all computers connected to the Internet are going to leverage one another's capabilities and create enormous value. The proponents of this argument found support in a digital "law" proposed by Bob Metcalfe, inventor of the Ethernet and founder of 3Com, circa 1980. Most people believe that Metcalfe's Law states that "the value of a network is equal to the square of the number of its users." Value is expressed as a monetary equivalent - some multiple of U.S. dollars, for example. By that argument, when the Internet had 100M users, its value was equal to some constant multiple of $10¹⁶ (100M squared). Surely, the multiple cannot be too big, since that amount far exceeds the overall output of the world economy.

Bob Metcalfe is a partner in my firm and he gave me a copy of the original slide describing Metcalfe's Law (see Figure 1). In its original form, the statement was, "The systemic value of compatibly communicating devices grows as the square of their number." The law is based on the mathematical truth that there are N(N - 1) possible point-to-point connections between N computers on a network. Value is derived through "compatible communication," a meaningful information exchange between the machines. The law assumes that all connections carry equal value. While not true in the real world, this simplifying assumption does not change the fundamental insight of Metcalfe's Law. Metcalfe's Law defines a theoretical maximum for the value of a network. Because value is derived through the leverage of meaningful connections between machines, in a network where only a few machines are regularly exchanging information, the actual network value will be significantly lower than the potential allowed by Metcalfe's Law.

The main difference between what Bob Metcalfe said and how it is remembered has to do with how the size of the network is measured - devices versus users. This turns out to be a crucial difference. Machines can easily scale the number of connections or conversations they have with other machines. They are good at processing a lot of information. Humans cannot do that. The brain is not designed for it. For example, while my computer at home can access hundreds of Web sites in just a matter of minutes, I, its user, could neither handle the fast context switching nor process the amount of information presented. Humans get overloaded with information fairly quickly. In the digital domain, people use server-based applications that aggregate and integrate information. Search engines and portals on the Web are good examples as are integrated information systems in corporations. As the number of sources and amounts of information increase, people need more help using this information. As the size of networks goes up, servers and the connections between them - not the number of users of the network - offer the most potential for exponentially increasing the value of networks. And the key to that potential is integration.

Analyzing Network Value
A simple network model demonstrates the value-creating potential of integrated servers. Assume that in a network of N machines, a fraction s are servers, each running one application. The total number of servers is sN. On average, each server is integrated with a fraction i of the other servers, so the total number of server-to-server connections is i(sN)². The clients for these servers are desktop computers. Clients cannot communicate directly between themselves; they can only communicate with servers. On average, each client communicates with k servers, so the total number of client-to-server connections is k(1-s)N. Combined, the client-to-server and server-to-server connections yield the total number of connections for the network.
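
To make the model concrete, here is a minimal Python sketch of the connection count; the function name and parameter defaults are illustrative choices, not part of the original analysis:

    def total_connections(N, s=0.10, i=0.10, k=10):
        # N: machines on the network
        # s: fraction of machines that are servers
        # i: fraction of the other servers each server integrates with
        # k: average number of servers each client talks to
        servers = s * N
        server_to_server = i * servers ** 2   # grows as the square of sN
        client_to_server = k * (1 - s) * N    # grows linearly with N
        return server_to_server + client_to_server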

For example, let's choose a network with 90% clients and 10% servers. The integration ratio between servers is 10%, and clients communicate with 10 servers on average. Figure 2 shows how the value of this network compares to the ideal value allowed by Metcalfe's Law for networks of between 10 and 100,000 nodes. Both axes use a logarithmic scale to help with comparisons for lower values of N. As the network grows, its value is mostly generated by connections between servers - i(sN)² - because these connections scale with the square of the number of servers. Figure 3 demonstrates that after about 10,000 nodes, server connections contribute more value than client connections, whose value grows only linearly with the number of clients.

This insight explains the difference between large and small networks. For small networks, the value is dominated by clients. However, in very large corporate networks or networks the size of the Internet, the mere addition of more users does not increase the value of the network more than linearly. On the other hand, the addition of meaningful applications that (1) leverage the value of many other applications and (2) can be accessed by all users has significantly more impact. This makes sense when applied to the Web. Do you personally get more value out of the few thousand servers that run in the Google data centers or a million broadband-enabled households in Korea?

Metcalfe's Law holds for the example network because server connections grow as the square of the number of network nodes. However, the value of the network is far lower than the potential allowed by Metcalfe's Law. As N grows, the ratio between actual value and potential value approaches is². For the example network this equals 0.1%, or 1/1,000 of the true value potential (see Figure 4). Value grows as the square of the fraction of servers on the network. Also, the integration ratio, i, is directly related to the value-generating potential of networks. The more servers are integrated with one another, the more value the network generates.
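
A few lines of arithmetic (again a sketch, using the example parameters above) confirm both the crossover near 10,000 nodes and the asymptotic ratio:

    s, i, k = 0.10, 0.10, 10
    for N in (100, 1000, 10000, 100000):
        server = i * (s * N) ** 2         # server-to-server connections
        client = k * (1 - s) * N          # client-to-server connections
        ratio = (server + client) / N**2  # actual value vs. Metcalfe potential
        print(N, server, client, round(ratio, 4))
    # Server connections overtake client connections just below N = 10,000
    # (exactly at N = k(1-s)/(i*s**2) = 9,000), and as N grows the ratio
    # tends to i*s**2 = 0.001, i.e., 0.1% of the Metcalfe maximum.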

The Killer App
This simple network model may not apply to networks the size of the Internet. If N is 100M, then each of 10M servers would have to communicate with 1M other servers. Network boundaries, computational limits, and application as well as integration complexity make this impossible in the general case. If there is a limit to the number of meaningful connections that any one server can make, then Metcalfe's Law will no longer apply to the network. Value will still increase, but at a linear, not quadratic, rate.
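
A hypothetical cap makes the point concrete. In this sketch, C (an assumed parameter, not part of the model above) bounds the number of meaningful connections per server; growth is quadratic below the cap and linear above it:

    def capped_server_connections(N, s=0.10, i=0.10, C=1000):
        # Each server would like to integrate with i*s*N peers,
        # but can sustain at most C meaningful connections.
        per_server = min(i * s * N, C)
        return s * N * per_server  # quadratic until i*s*N > C, linear after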

Even in that case, however, integration remains the best hope for increasing the value of the network beyond just increasing its size. Integration can happen on a larger scale than any user-centric information processing. It is about meaningful connections between network nodes and thus leverages the core insight of Metcalfe's Law. That is why integration is the killer app (in a meta-sense) in a world with an ever-increasing number of applications running on ever larger networks.

The need for better integrated applications is huge. Here are just a few examples from the public Internet and corporate networks:

  • Have you ever forgotten some of your passwords? If single sign-on were broadly deployed, you would not have to remember so many passwords.
  • Have you ever chosen not to buy from a Web site offering you the best terms because you didn't have an account set up with them and didn't want to spend the time registering? If eWallets worked broadly on the Web, this would not be a problem.
  • E-mail is the lifeblood of companies, and much relevant business information is sent and received through e-mail, often outside the context of the enterprise systems. A whole industry has sprung up to address this lack of integration across multiple domains: sales automation, collaboration, resource planning, and so on.
  • One large financial institution takes more than a week to process a change of address. The institution estimates it could make up to 10% more per customer if only it could process the change in less than 24 hours. It has been trying for years, without success, to integrate the several applications involved.
  • Forrester Research estimates that more than 50% of Fortune 1000 enterprises have more than 50 legacy or packaged applications that remain unintegrated - in spite of the functional and business needs that require them to be. Given the simple example of the financial institution above, the opportunity cost of not integrating these applications is huge.
  • CIO magazine's tech poll recently showed that 87.5% of surveyed companies have an IT application backlog, weighted heavily toward integrating existing systems as opposed to buying new systems.

End users as well as IT and business executives want more integration and information leverage between the applications they use every day. More integration is needed everywhere. The integration market is huge. IDC estimates that in 2002 the size of the application integration software market in the U.S. was approaching $5B. At the same time, the U.S. systems integration market was about $38B. Most of that is pure labor cost - as opposed to hardware and software cost - and much of it is spent on integrating existing systems as opposed to building new ones. It's difficult to estimate how much money is spent in-house on application integration, but it's certainly likely to be a significant portion of large corporations' IT budgets. Two things become clear from the analysis. First, the total market size ends up being in the tens of billions for the U.S. alone. Second, labor accounts for the majority of that amount.

The Service Constraint
To harness the value potential of applications, we need a lot more integration. The only way to achieve this is to significantly lower the cost of integration across the board. There is a problem, however. The low product-to-service ratio in the integration space constrains both the market's rate of growth and the cost of integration projects. Screen scraping, data mapping, and building bridges between incompatible APIs are not scalable activities. They require trained personnel and a lot of time. These activities simply cannot keep pace with the trends described at the beginning of this article. Also, given the high service component of integration projects, even big decreases in the cost of integration products cannot significantly reduce the total cost of integration projects.

When technology really picks up its pace, IT services cannot keep up. Think back to the days of the Internet explosion. It became nearly impossible to hire qualified people in 1998-1999, either full time or as consultants. Salaries were increasing while projects were failing in the hands of unskilled programmers and business analysts. During a technology recession the pressure is released, but this temporary downward shift in demand cannot change the basic fact that skilled labor is scarce. Unless enterprises fundamentally change their dependence on integration services they will not be able to leverage the powerful trends in computing, storage, connectivity, and standardization. Nor will they be able to leverage the full value potential of integrated distributed applications. Companies will be stuck with integration backlogs that forever exceed their resources, both monetary and human.

It's important to understand that XML and Web services cannot significantly increase the product-to-service ratio by themselves. Standards in that area do help eliminate low-level tasks such as forming messages in proprietary binary protocols and programming data transformations in low-level languages. However, XML and Web services are just tools, not end goals in themselves. If two companies use different schemas to describe what a customer is and have different Web services APIs for manipulating customers in their CRM systems, then somebody has to spend time to define the data transformation between the data types and to develop a bridge between the applications. This is high-level work that requires trained professionals. Standards will never, ever bring homogeneity in the way enterprises view information and in the way software vendors build applications. Competitive pressures always demand differentiation, which leads to incompatibilities.
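
To illustrate, here is a minimal sketch of such a hand-written bridge in Python; the two customer schemas and all element names are invented for illustration, not drawn from any real CRM product:

    import xml.etree.ElementTree as ET

    # A customer record as hypothetical CRM "A" represents it
    crm_a = ET.fromstring(
        "<customer><name>Jane Doe</name><addr>1 Main St</addr></customer>")

    # Re-express the same record in hypothetical CRM "B"'s schema;
    # each field mapping is a human decision no standard makes for us
    crm_b = ET.Element("Client")
    ET.SubElement(crm_b, "FullName").text = crm_a.findtext("name")
    ET.SubElement(crm_b, "StreetAddress").text = crm_a.findtext("addr")

    print(ET.tostring(crm_b, encoding="unicode"))

Both documents are perfectly valid XML, yet neither system can consume the other's format without this mapping layer - the standard buys a common syntax, not a common meaning.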

The best approach is building better systems that significantly lower the planning, design, development, testing, and operating costs of integration. There is no single silver bullet. The efforts need to combine product development with standards work, and partnerships between companies with best practices. People often think of integration as a boring business. Nothing could be further from the truth. It is a huge and fast-changing market with a desperate need for more innovation.

In a follow-up article I will describe in more detail the areas that need attention and some of the promising technologies that can help. Also, I'll point out how some visionary companies, such as Microsoft, deeply understand that integration is the killer app of the future and are making long-term strategic moves to lower the cost of integration across the board.

About the Author

Simeon Simeonov is CEO of FastIgnite, where he invests in and advises startups. He was chief architect or CTO at companies such as Allaire, Macromedia, Better Advertising, and Thing Labs. He blogs at blog.simeonov.com, tweets as @simeons, and lives in the Greater Boston area with his wife, son, and an adopted dog named Tye.
