
Estimating the Hidden Costs of Cost Estimation

Federal agencies are not properly equipped to estimate their future IT infrastructure costs

A recent Government Accountability Office (GAO) report found that most federal agencies, with the exception of the Department of Defense, are not properly equipped to produce accurate cost estimates for their IT infrastructure. There are many reasons for this, but the problem starts with the data being fed into most cost estimation practices and models.

For any organization, federal or commercial, the ability to credibly estimate the time and budget required for a project to reach a successful conclusion is crucial. The many benefits of good estimating have been explained over and over.

According to the GAO, federal agencies are not setting a good example in estimating their IT projects. Most rely on weak processes built around expert opinion, while some employ tools such as parametric models. At the root of any process, whether parametric or expert opinion, agencies need access to information about the systems they are supporting or seeking to develop - and this is precisely where the process begins to break down.
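To make "parametric model" concrete: such models fit effort to measurable inputs like system size. The post doesn't name a specific model, so the sketch below uses the well-known basic-COCOMO form (effort = a × KLOC^b) with the published "organic mode" constants purely as an illustration - a real agency model would be calibrated against its own historical data:

```python
# Illustrative parametric estimate in the basic-COCOMO style.
# effort (person-months) = a * KLOC ** b
# The constants below are the published basic-COCOMO "organic mode"
# values; they are a stand-in here, not a calibrated agency model.

def parametric_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Estimate effort in person-months from size in thousands of lines of code."""
    return a * (kloc ** b)

# A hypothetical 50 KLOC system:
print(round(parametric_effort(50), 1))  # roughly 146 person-months
```

The point is not the constants - it's that the model is only as good as the size figure you feed it, which is exactly where the GAO found agencies falling down.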

Incomplete, Bad and Unattainable Data
Collecting data is not cheap; it takes time and effort to do properly. When budgets are tight, data collection is often cut from programs. As a result, agencies have an incomplete view of their systems. When they do have data, it is often 'dirty' - poor timekeeping or project-tracking practices have rendered it effectively meaningless. In many cases, the system integrators performing the work hold the data, but the agencies don't have access to it.

So, in lieu of data, agencies rely on expert opinion to provide the basic inputs to their estimating process. But depending on the day your expert is having, they'll give you some stats about the applications (or not) and send you on your way. How can you be sure those inputs are reliable?

The short answer is that you can't. Front-load your estimation process or model with uncertain data and any result that comes out will be unreliable ... garbage in, garbage out.

Shrinking the Cone of Uncertainty
Federal organizations would benefit greatly from automated software analysis and measurement systems that generate unbiased metrics about their applications. Injecting fact-based measures into the front end of an estimating process greatly reduces the Cone of Uncertainty.

The Cone of Uncertainty describes the evolution of uncertainty in a project. In the beginning, when little is known, estimates are subject to large uncertainty. As more information is learned, the uncertainty decreases.

Injecting an accurate calculation of a system's size greatly reduces that uncertainty. Supporting the size data with measures of the system's technical and functional complexity, and an objective assessment of its underlying structural quality, reduces it further. An estimate with little uncertainty - a high degree of confidence - is the foundation for accurately predicting a development team's productivity. This matters because planning and budgeting are ultimately exercises in determining how to allocate resources and when new capabilities will be available to your clients. How many developers will I need? How long will I need them? When will they be finished?
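The cone is often quantified with phase-by-phase range multipliers. The figures below are the commonly cited Boehm values (popularized by Steve McConnell), used here only to illustrate how the band narrows - they are not calibrated to any particular agency:

```python
# Commonly cited cone-of-uncertainty range multipliers (Boehm).
# Each phase maps to (low, high) factors applied to a nominal estimate;
# the range narrows as more is known about the system.
CONE = {
    "initial concept":             (0.25, 4.0),
    "approved product definition": (0.5,  2.0),
    "requirements complete":       (0.67, 1.5),
    "design complete":             (0.8,  1.25),
}

def estimate_range(nominal: float, phase: str) -> tuple:
    """Return the (low, high) effort range for a nominal estimate at a phase."""
    low, high = CONE[phase]
    return nominal * low, nominal * high

# A nominal 100 person-month estimate at two phases:
print(estimate_range(100, "initial concept"))  # wide: (25.0, 400.0)
print(estimate_range(100, "design complete"))  # narrow: (80.0, 125.0)
```

Fact-based measures of size and complexity, gathered early, effectively let you start further down the cone than expert opinion alone would allow.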

There are several sources (Standish's Chaos Reports) that document the IT industry's legacy of poor delivery. And there are many reasons why IT projects continue to fail.

We know that most IT budgets, both federal and commercial, are spent maintaining and supporting existing systems. It is clear that the agencies that own these systems suffer from a lack of visibility into their complexity. Without this information, any planning and budgeting is handicapped. IT-intensive programs, which require the most planning to deliver systems on time and on budget, would improve if we could shed some light on these systems and arm agencies with objective, fact-based insight.

Making the Invisible Visible
Through static code analysis, you can measure your application in real time, and gather unbiased metrics to share both internally and externally.
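As a minimal illustration of what "unbiased metrics straight from the code" can look like, the sketch below uses Python's standard ast module to pull a few size and complexity counts out of a source string. It is a toy - commercial analysis platforms go far deeper and cover many languages - but it shows the principle: no human opinion in the measurement loop:

```python
import ast

def measure(source: str) -> dict:
    """Count a few size/complexity signals directly from Python source.
    Decision points (if/for/while/except/bool-op) give a rough,
    cyclomatic-style complexity signal."""
    tree = ast.parse(source)
    funcs = [n for n in ast.walk(tree)
             if isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef))]
    decisions = [n for n in ast.walk(tree)
                 if isinstance(n, (ast.If, ast.For, ast.While,
                                   ast.ExceptHandler, ast.BoolOp))]
    return {
        "lines": len(source.splitlines()),
        "functions": len(funcs),
        "decision_points": len(decisions),
    }

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        pass
    return "non-negative"
"""
print(measure(sample))
```

Run the same measurement nightly across a portfolio and you have a consistent, opinion-free baseline to feed the front end of your estimates.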

Because these metrics come straight from the product being managed and worked on, in real time, the data is consistent across all programs. This independent, unbiased data can then support program decisions around the ongoing management of the application.

When a racing team is tuning a car's engine, it doesn't ask the engineer what he thinks and run the race based solely on that opinion. It fills the engine with sensors and monitors every metric it can grab. If your organization approaches software estimation the same way, you'll build a repository of useful data that shows how your IT infrastructure has evolved - and what it will take to bring it to the next level.

Be sure to flip over to Dan Galorath's article on data-driven estimation for more information on this topic.

More Stories By Lev Lesokhin

Lev Lesokhin is responsible for CAST's market development, strategy, thought leadership and product marketing worldwide. He has a passion for making customers successful, building the ecosystem, and advancing the state of the art in business technology. Lev comes to CAST from SAP, where he was Director, Global SME Marketing. Prior to SAP, Lev was at the Corporate Executive Board as one of the leaders of the Applications Executive Council, where he worked with the heads of applications organizations at Fortune 1000 companies to identify best management practices.
