PPDM – Professional Petroleum Data Management

Personal Computers for Oil & Gas Data Management

Cast your mind back, for a moment, to 1988. Ronald Reagan was leaving the White House, Rain Man was the year’s biggest hit, and mainframe computers were being challenged by PCs.

Although politics and Hollywood may have been getting all the public attention, it was the transformation of the computer industry that held far more challenges and consequences, especially for the petroleum industry. Thanks to advances in a broad swath of exploration and development technologies, oil companies were gathering and interpreting far larger volumes of data than ever before. For decades they had relied on mainframe computers, proprietary software and in-house staff to handle the work. It was becoming increasingly clear to E&P companies, however, that workstations offered real benefits, including improved graphics and user interactivity.

The impetus for what would eventually become the PPDM Association actually dates back several years earlier, to 1984, when Gulf Corporation was purchased by Chevron and its services to its subsidiary Gulf Canada were terminated. At the time, Mel Huszti was the coordinator of exploration mapping systems at Gulf Canada, in Calgary. “We had been dabbling with workstations and had similar systems between Gulf Canada and Gulf Corporation in Houston,” he recalls. Most of Gulf Canada’s workload continued to be processed on Unisys and IBM centralized systems, but in October 1988, Gulf Canada announced it was phasing out its mainframe Unisys computer. Due to lower oil prices, its 1989 budget was also reduced, and there would be less emphasis on western Canada exploration activities. However, the company’s emphasis on mapping and workstations would be increased. The target date for phasing out the Unisys computer was July 1989.

The relocation of Gulf Canada’s E&P mapping from the Unisys computer involved:

• Improvements to the geophysical mapping system that used the IBM MVS operating system;

• Migration of the mainframe well database to a new database environment;

• Replacement of Gulf Canada’s existing base mapping software and development of appropriate interactive user-interface software;

• Transfer of the existing project database structure into the workstation environment. This suggested opportunities to involve third-party software developers in the joint development of particular applications.

The project database structure had to accommodate storage for general well header information, digital well log data, seismic location information, seismic data, land parcel information, reservoir field/pool information, faults and formations, geographical information, and surface grids.
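To make that scope concrete, here is a minimal, hypothetical sketch of two such tables. The table and column names are invented for illustration only and are not taken from the actual Gulf Canada or PPDM schemas; SQLite stands in for the Oracle database of the era.

```python
import sqlite3

# An in-memory database standing in for the Oracle instance of the day.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# General well header information: one row per well, keyed by a
# unique well identifier (UWI). Columns are illustrative assumptions.
cur.execute("""
    CREATE TABLE well (
        uwi          TEXT PRIMARY KEY,   -- unique well identifier
        well_name    TEXT,
        surface_lat  REAL,
        surface_long REAL,
        spud_date    TEXT
    )
""")

# Seismic location information: shotpoints along a seismic line.
cur.execute("""
    CREATE TABLE seis_point (
        line_id      TEXT,
        shotpoint_no INTEGER,
        lat          REAL,
        long         REAL,
        PRIMARY KEY (line_id, shotpoint_no)
    )
""")

# With both data types in one consistent structure, a mapping
# application can query wells and shotpoints together instead of
# parsing two proprietary file formats.
cur.execute("INSERT INTO well VALUES "
            "('100/01-01-001-01W5/0', 'Example Well No. 1', 51.05, -114.07, '1988-10-01')")
cur.execute("INSERT INTO seis_point VALUES ('LINE-88-01', 101, 51.06, -114.08)")
for row in cur.execute("SELECT well_name, surface_lat, surface_long FROM well"):
    print(row)
conn.commit()
```

The point of the sketch is the design principle, not the columns themselves: every application reads and writes the same shared tables rather than its own private file format.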

By December 1988, the Gulf team had decided that it would move the various exploration data processing activities to a network of Unix workstations: Sun 3/60, Sun 4/280 and Tektronix 4335 machines. The intent was to use the most appropriate hardware for the various exploration tasks and to ensure that all the software was, as much as possible, portable across the network.

In conjunction with an “open” hardware strategy, Gulf envisioned an “open” software strategy that would make use of industry-standard database and network products. From an exploration-application perspective, it would be desirable to plug into an existing third-party package that would provide general database services and allow Gulf to add any specialized and proprietary products.
The Drive to a Common Data Model

To achieve these goals, Huszti needed a common data model that would allow Gulf Canada’s geoscience and engineering software applications to access the same data from a consistent data structure. At the time, a host of small software firms in Calgary were experimenting with new platforms, databases and software that could run on desktop hardware. These firms included Applied Terravision Systems (which had been working to use the Oracle database for their oilpatch applications), Finder Graphics (which had developed interpretation software that relied on integrated well and seismic data), and Digitech (which had developed software to integrate public data sources and distribute the result to their oilpatch clients). Pat Rhynes was a founder of Applied Terravision Systems. “We were approached by Mel Huszti in 1988,” he recalls. “Gulf Canada needed a data model that would support its move from mainframes to workstations. I wanted it to be an open model because, as Mel said, ‘If any company goes into receivership, I want the data model to reside in the public domain.’”

Starting such a project in Alberta had another advantage. The Energy Resources Conservation Board (ERCB) had legislated requirements for submitting well data. “They had established standards for the data decades before the advent of PPDM,” says Yogi Schulz, an industry consultant and long-time PPDM Association board member. “One of the main reasons that PPDM started here is because Alberta had more encompassing standards than other oil centers around the world. It was the fruit of necessity.”

Gulf Canada negotiated a joint software development venture with ESI (Finder Graphics), Digitech and Applied Terravision. Huszti then put together a brainstorming team to come up with a prototype data model. The team included Davis Swan from Gulf, Jim Sharuk from Digitech, John Gillespie of Finder Graphics and Pat Rhynes from Applied Terravision. “Mel put us all in a room and told us we couldn’t come out until we had something,” says Rhynes. “It took us about a week to finish our first module. Frankly, it was awful.”

But it was a start. In the following months, a dozen experts focused on making the system work. “Digitech had to convert its data format to accommodate the new model,” says Huszti. “Both Applied Terravision and Finder had to convert their application software to Oracle in the new data format. The hardest part for Gulf Canada was to have software which extracted data from an IBM mainframe and transferred the data into an Oracle database sitting on a Unix workstation. Unless you had that in place, the data wouldn’t transfer across. It took us six months to debug the communication software.”

The team plowed on, and by the following year it had a functioning system that would take information from thousands of wells, combine it with seismic shotpoint locations, and display it on a map at a workstation. “It was a win-win-win situation,” says Huszti. “The company cut costs, removed the Unisys mainframe and improved the use of computer technology.”
Overwhelming Immediate Interest

The team was eager to share its invention and ask for industry input. “We were able to present a paper at a Geobyte conference in late 1989,” recalls Rhynes. In addition, Applied Terravision exhibited at the SEG conference in San Francisco that year, where the data model was on display and Applied Terravision was giving it away for the cost of the binders. Bob Tretiak couldn’t believe the reaction. “People were just grabbing the binders and throwing cash on the table,” he says. The response, to Applied Terravision’s surprise, was overwhelming. “We sent over one hundred floppy disks worldwide for $100 each. People were really excited; you have to remember, it was the late 1980s and people were astounded that someone was actually sharing a data model and asking for help improving it. There had been very little discussion of open standards prior to this. I don’t think we realized how much the oil & gas industry would jump on it.”

In retrospect, oil & gas industry demand for an open data model seems almost inevitable. An open data model eliminates the need to develop, evolve and maintain internal data models and related custom data management software. It shortens the take-up time for new software applications, lowers the cost of updating and maintaining information, and improves the quality, quantity and timeliness of information. It clarifies data ownership. It reduces risk through improved reliability with clear, concise data definitions. It minimizes data transfer difficulties between software applications or multiple databases. Using an open data model not only reduces costs, but also positions companies to become more productive, efficient and competitive.

But creating a standard model for the petroleum industry that was accepted and actively employed by many diverse parties was a task beyond the scope and capabilities of any one company or government agency. Some form of organization was needed that could transcend commercial and bureaucratic boundaries. “It made sense to start a neutral, non-profit organization,” says Rhynes. In 1989, Mel Huszti and Bob Tretiak, president of Applied Terravision, created the Public Petroleum Data Model (PPDM) User Group. The goal was to create an open, business-driven data model for the petroleum industry with the participation of volunteers from a broad range of companies, government agencies, vendors and service companies. “We got a lot of volunteers soon after,” says Rhynes. “No one had a vested interest.”

http://www.ppdm.org/
