Use a Native XML Database for Your XML Data

Deciding when an XQuery-based native XML database is better than an SQL database

Developers tend to use the most familiar technologies. For data storage, that is the relational database. During design it's easy to see tables of data everywhere; however, not everything is relational in nature. When dealing with XML data or data easily expressed as XML, XQuery-based native XML databases (NXDs) present a viable and cost-effective alternative to relational databases, file system storage, or custom developed storage implementations. So, when is it time to consider an NXD?

Can native XML databases really provide a better answer for your data storage needs? In this article we'll examine some guidelines to help answer that question.

When to Consider a Native XML Database

  1. Do you have thousands of XML files?
  2. Is your XML data larger than 200MB?
  3. Are you trying to build a hierarchy into tables?
  4. Could your data change over time?
  5. Have you spent more than $100 on books explaining SQLXML?

The File System Isn't a Database
The first two questions are practical in nature. If estimates indicate more than 1000 XML files or 200MB of XML data exist, the file system isn't the right tool for the job. File systems are not built to manage large numbers of files in a single directory or huge directory hierarchies. Concurrency, out-of-disk-space conditions, and other common problems will plague your application unless you use a database.

Naturally Hierarchical
The third reason to consider an NXD has to do with the impedance mismatch between relational databases and XML data. Fundamentally, XML data is hierarchical and is a poor match for a relational database's rows and columns. Relational databases have always had a hard time modeling hierarchies. You'll find dozens of workarounds for this, but no truly simple and efficient solution. Any XML-to-relational mapping tool has to pick one of these techniques and handle this common case as best it can. Regardless of the technique chosen, performance will suffer, and the end result will be more brittle over time. Changing the structure of the data - which is easy, natural, and useful to do in XML - forces a redesign of the relational schema and changes to the mapping layer. A single mapping mistake can skew an entire data set. As an XML document's structure evolves, attributes may become elements, and vice versa. Over time, incremental changes to the logical structure of the XML data force physical changes to the database, and a wholesale dump and reload of the relational system may be required to keep pace.

Structured Yet Flexible Data
The fourth question suggests that requirements change over time, something true of most real-world business systems. Once a relational database schema has been set in stone, only the database administrator (DBA) is qualified to change it without disrupting services. The contract between a relational database and the programs that use it becomes the weakest link in the system. As requirements change, the DBA will spend endless hours mitigating the issues that arise. Contrast that with XML databases. XML is both structured and flexible. Even XML documents conforming to a DTD or XML Schema retain a high degree of flexibility compared to relational schemas. Most NXDs will optionally validate document structure, and even when validation is not enforced, XML documents maintain a high degree of implicit structure. In either case, XML documents are flexible yet structured, so the contract between the database and the programs using it is not brittle. It can withstand change without requiring a DBA.
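As a small illustration (plain Python with the standard library's ElementTree; the documents are invented), two documents whose structure has drifted apart can be stored and queried side by side with no schema migration:

```python
import xml.etree.ElementTree as ET

# Two documents with evolving structure: the second adds an element the
# first lacks. No ALTER TABLE, no remapping -- the query simply ignores
# what it doesn't ask for.
docs = [
    "<person><name>joe</name></person>",
    "<person><name>jane</name><email>jane@example.com</email></person>",
]
names = [ET.fromstring(doc).findtext("name") for doc in docs]
print(names)  # ['joe', 'jane']
```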

Data Mapping Is Wasted Time, Money, and Effort
The last question is really a reality check. Take a second to consider the amount of time and money you've spent trying to make a solution workable. When you find yourself in a hole, the best thing to do is stop digging. With that in mind, let's look at SQLXML. If you need to mix and match SQL data and XML data, it's not a horrible way to go. However, if you view it as a way to squeeze XML into a relational system in which you've already invested, you might want to reconsider. You'll pay a performance penalty in CPU and memory for every document stored, and your ability to query, index, and optimize will be impacted as well. Executing XQuery against XML data mapped into relational tables is hindered by the non-native storage format; an optimized native XML database doesn't carry the same penalty.

Let's take a look at how to use an NXD to learn more about its advantages over a relational database.

Examples and Code
Berkeley DB XML is an open source XQuery implementation built atop the Berkeley DB transactional database system. It supports optimized XML storage, XQuery query planning, massive scale and concurrency, and is available for download as source code for multiple platforms and as a Windows installer from Sleepycat Software (www.sleepycat.com). Berkeley DB XML has also been integrated with Stylus Studio if you're more comfortable using an IDE for development. Berkeley DB XML is readily available to anyone, so we'll use it for the following examples.

First let's create an in-memory container for XML documents. Follow along using the "dbxml" command-line shell provided with Berkeley DB XML.


dbxml> createContainer ""
dbxml> putDocument myDoc <names><name>joe</name><name>fred</name><name>jane</name></names>
dbxml> query collection('')/names/name[.='joe']
dbxml> print
<name>joe</name>
Line 1 creates the container, while line 2 places a simple document into the container. The third line performs a query on the container, returning each document that matches the XPath expression /names/name[.='joe']. Finally, the print command displays what was returned by the query.
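For readers without dbxml at hand, the shape of that query can be mimicked with Python's standard ElementTree (illustrative only: Berkeley DB XML evaluates full XQuery against indexed storage, while ElementTree supports just a small XPath subset, and the value predicate below requires Python 3.7+):

```python
import xml.etree.ElementTree as ET

# The same document and value predicate as the dbxml session above.
root = ET.fromstring(
    "<names><name>joe</name><name>fred</name><name>jane</name></names>")
matches = [ET.tostring(e, encoding="unicode")
           for e in root.findall("name[.='joe']")]
print(matches)  # ['<name>joe</name>']
```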

That's all quite useful, but let's say that you need to access your XML storage programmatically. Because Berkeley DB XML is a library, and as such is linked into your application just as any other library would be, it does not incur the overhead of client/server communication. You interact with Berkeley DB XML using one of the supported language APIs. The primary one is C++, as the product is written in C++. Java, Python, Perl, PHP, and TCL are all supported API languages. Many other languages are supported by third parties and are readily available on the Internet.

With that in mind, let's next try some simple C++ code that accomplishes the same thing as the dbxml commands used in the previous example (see Listing 1).

The underlined sections of the code relate back to the first example. The first underlined section creates a container. This container, called 'test.dbxml', is on disk rather than in memory. The next underlined section places the same simple XML document into the new container. The last underlined section issues the query, and the while loop equates to the print statement. Put it all together and you have essentially the same result as before. In Java the code looks much the same; again, the underlined areas are the key common sections of code (see Listing 2).

As you can see, the API is fairly straightforward and similar across languages. You'll note that the C++ and Java examples create a database on disk rather than in memory simply by giving the container a name.

Let's move back to the dbxml shell and try a more complicated example. This time let's add a few documents so we can explore the performance of the system.

First let's populate an imaginary parts database.


dbxml> createContainer parts
Creating document storage container
That created an empty container and opened it as the default container in the shell. We can now use the putDocument command to run our XQuery and insert the sample data.
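The generated sample data isn't shown in this excerpt, but a couple of hand-entered parts give the idea (the part structure below is invented for illustration, and quoting rules for putDocument vary between Berkeley DB XML releases):

```
dbxml> putDocument part1 <part><number>1</number><description>widget</description></part>
dbxml> putDocument part2 <part><number>2</number><description>sprocket</description></part>
dbxml> query collection('parts')/part[number='1']
dbxml> print
<part><number>1</number><description>widget</description></part>
```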

More Stories By Gregory Burd

Gregory Burd is the Product Manager for Sleepycat Software, now a part of Oracle. Prior to Sleepycat, he was on the business team at KnowNow, a Kleiner Perkins startup in the San Francisco Bay Area. He has many years of software development and product leadership within companies such as JavaSoft, a division of Sun Microsystems, Marble Associates, a consulting company, and NeXT Computer, now part of Apple Computer.

More Stories By Kimbro Staken

Kimbro Staken is an independent consultant, author, and open source developer specializing in technologies for XML data management. He is one of the primary developers of the dbXML Core Open Source native XML database and a cofounder of the XML:DB Initiative.
