Using Data Management to Drive Business Objectives

At the 19th International Conference on Petroleum Data Integration, Information and Data Management, one theme came up again and again: the need to achieve strategic business objectives from data management initiatives.  Many presenters touched on this topic, and several built their entire presentations around it, including Noah Consulting, EP Energy, ConocoPhillips, and CLTech Consulting.

Strategic Data Management

Matt Tatro of Noah Consulting presented on the idea of applying Lean manufacturing principles to oil and gas data management.  In Lean, you are always looking for ways to improve continuously, and that requires a working definition of success.  He made the key point that “everything revolves around the improvements and the resulting impact on the balance sheet.”  Entrance strongly believes in this concept, and on our client projects we employ the Agile Scrum methodology to facilitate continuous improvement.

As other presenters would go on to note later in the conference, management buy-in is key to data management project success.  The best way to ensure success is to have quantifiable metrics that demonstrate real financial benefits to the organization.  Especially in light of the recent industry downturn, the focus on bottom-line results should give the back office all the more reason to ensure that projects are aligned with the strategic objectives of the business.  In the Envision phase of our projects, Entrance always works with clients to identify the business value that is ultimately to be achieved at project completion.

EP Energy presented on how they transformed their IT operation to improve results by turning their focus from technology objectives to business objectives.  ConocoPhillips discussed their strategic approach to prioritizing subsurface data workflows with an end goal of accelerating delivery of value to the business.  Entrance sees many firms in the upstream space embracing the idea of IT alignment with strategic business objectives and we structure our projects to do the same.

Jess Kozman of CLTech addressed the issue head-on in his presentation entitled “Rebranding Data Management for Executive Relevance.”  He calls out three particular challenges around the low visibility of data management in most organizations; the most significant, in my opinion, is the perception of data management (and frequently IT in general) as a cost center rather than a value-add to the organization.  At Entrance we pride ourselves on delivering real business value to our clients, not just a technical solution; driving efficiency in our clients’ businesses is the most exciting part of what we do.

In conclusion, ensure that your data management initiatives are value-creating projects.  Quantify the expected results up-front and validate them at the end.  Use Lean or Agile principles to enable you to make changes on-the-fly to keep projects well-aligned with strategic business objectives.

PNEC Data Management Conference Recap: The Role of IT in Data Management


At the 2015 PNEC data management conference for the oil and gas industry, one of the themes addressed by multiple presenters was the relationship between IT and the business. In particular, Chris Josefy and Omar Khan from EP Energy hit on three key areas that resonated with me in their presentation entitled “The Case for Transforming the Role of IT to support Upstream E&P Operations”:

  1. Being guided by the business objective, not the technical objective
  2. Driving more value from how the digital oilfield is managed
  3. Focusing on people, process, and data

Business value and strategic objectives are always in focus at Entrance when we engage on a new project, so we strongly agree with the approach that EP Energy IT has taken. The technology is merely an enabling tool that allows the business objectives to be reached! It is often easy for IT staff to focus on the technology (which, after all, they know best) and lose sight of why the initiative is being undertaken at all.

In order to fully leverage the digital oilfield investment, you have to talk to the boots on the ground in the field. So often, decisions are made in the corporate office without assessing or understanding how they will impact day-to-day operations in the field. EP Energy made a specific effort to send IT staff out into the field to understand where the real value lies, so that they could implement the most valuable tools in the shortest time. Entrance believes in this as well. The photos on our web site are of our own consultants wearing the hardhats!

Ultimately, what this approach means for the technical staff is a focus on people, process, and data. People are key to the success of any technology project, and Entrance has written and presented on this topic in the past because it has definitely been our experience as well. The process is about how the work gets done, which you will discover by working side-by-side with people in the field. If the technology is to be considered successful in the end, it needs to streamline the process, not burden it with cumbersome extra steps. Lastly, we can’t ignore the data. Understanding the data as it’s understood and used by the field staff is crucial; otherwise you will build data models and processes that are not sustainable.

Keep these points in mind on your next IT project journey!

How Does Software Age?

Out of date software is just as useless as a pile of old computer parts. Thankfully, aging software doesn't have to end up in the scrap heap.

Software applications age over time like any business asset. Unlike tangible assets, such as physical equipment, software depreciation tends to be less easily identifiable and less well-defined. Old software tends to “fly under the radar” in businesses, and as a result the associated risks and opportunities are not addressed in a timely manner. Ultimately, all software has to be rewritten in some form to extend its useful life. Software ages in many different ways, but today I want to explore one key way that tends to be easy to overlook when assessing the application development landscape.

Why can’t you just tell me how much it’s going to cost?

If you’ve ever engaged a vendor on a software development project, the vendor probably gave you an “estimate.” You may ask, “Why can’t the vendor just give me a pricing schedule or a firm quote?”

Software development is a precise way of automating an imprecise process. It is very hard to translate business rules into a programming language, because computers only understand 100% true and 100% false. A business process that only happens “sometimes” quite literally does not compute. If you could describe the rules that govern your business processes in 100% of scenarios, in 100% true/false language (no “maybe” allowed), then it would be theoretically possible to precisely calculate the effort involved in a software project (on a related note, the likelihood of having any bugs would also get pretty close to zero).
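
To illustrate the point, here is a minimal sketch, with an entirely made-up business rule and made-up field names, of what it takes to restate a “sometimes” rule as the 100% true/false logic a computer requires:

```python
# Hypothetical example: "Expedite shipping for important orders... sometimes."
# A computer cannot act on "sometimes"; every case must resolve to True or False.

def should_expedite(order_total: float, customer_tier: str, is_backordered: bool) -> bool:
    """Explicit, unambiguous version of a rule a human would state loosely."""
    if is_backordered:
        return False                      # never expedite what we cannot ship
    if customer_tier == "platinum":
        return True                       # always expedite top-tier customers
    return order_total >= 10_000          # otherwise only for large orders

print(should_expedite(12_500, "standard", False))   # True
print(should_expedite(500, "platinum", True))        # False
```

Every scenario your business cares about has to be pinned down this explicitly before the effort to build it can be calculated precisely.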

Another issue software vendors face when we put together a project estimate is a modern world in which the way you run your business is constantly changing. We’re all fighting a battle just to keep up with changes in the market, staff turnover/competencies, and evolving business processes to maximize operational efficiency. Software projects take time to implement, and it’s increasingly likely that a requirement that made sense at the outset of the project may no longer make sense a few months later when the application is deployed to production.

To combat the challenge of change, Entrance employs an agile software development methodology that allows the software requirements to morph over time as the needs of the business evolve.

The “iron triangle” of project management is well established: scope, schedule and budget (as the saying goes, “pick two”). Entrance typically engages with clients by fixing the schedule and budget, which allows the scope to adapt.

To set schedule and budget for a project, Entrance identifies the make-up of the team needed to achieve the project objectives. By identifying the team in advance, Entrance can predict the project “burn rate” with a high degree of accuracy. If we know how many people are working for how long, then schedule and budget are known.

In addition to setting a budget, Entrance recommends establishing a project contingency budget as well. Typically, we recommend a 20% contingency. The contingency may or may not get used on a particular project but it is not uncommon for a project to go into the contingency budget. If your vendor gives you one number for an estimate (i.e., no contingency or no budget range), then it is advisable for you to add your own 20% contingency for budget planning purposes. Frequently, contingency budget is used on agile projects to address additional scope that may not have been originally anticipated. However, the project sponsor ultimately controls whether or not the value proposition exists to justify moving forward.
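
As a rough illustration of the arithmetic behind fixing schedule and budget while scope stays flexible (the team make-up, rate, and duration below are invented numbers, not Entrance pricing), the burn-rate math looks something like this:

```python
# Hypothetical burn-rate math for a fixed schedule/budget, variable-scope project.
team = {"developer": 2, "qa_analyst": 1, "project_manager": 0.5}  # full-time equivalents
blended_hourly_rate = 150        # illustrative only
hours_per_week = 40
weeks = 16                       # fixed schedule

fte_total = sum(team.values())
burn_rate_per_week = fte_total * hours_per_week * blended_hourly_rate
base_budget = burn_rate_per_week * weeks
contingency = 0.20 * base_budget  # the 20% contingency discussed above

print(f"Weekly burn rate: ${burn_rate_per_week:,.0f}")
print(f"Base budget:      ${base_budget:,.0f}")
print(f"With contingency: ${base_budget + contingency:,.0f}")
```

Once the team and the calendar are fixed, the spend is predictable; the variable that absorbs change is the scope delivered within that envelope.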

By staying engaged with a product owner and project sponsor via semimonthly iterations in our agile process, we have achieved great success in meeting objectives without having a fixed scope up-front. In the agile process, contrary to what you may think, we actually do more planning than in a traditional “waterfall” (all requirements identified up-front) approach. Because we work in time-boxed iterations, you are able to see progress regularly and make small course corrections as we go. By allowing the scope to adapt to your needs, we put you in the driver’s seat and ensure that the implemented functionality meets your most current business requirements.

 

Conference Recap: PIDX Fall 2014

Standards as Value Creators in Oil and Gas

I attended the PIDX International US Fall Conference last week in Houston, where the theme was “How Standards Accelerate the Realization of Value in Oil & Gas.”  A variety of speakers addressed this topic with real-world examples, including Entrance’s President Nate Richards.  I want to highlight some of the key messages that I saw coming out of this gathering of industry experts.


 

Be Proactive Not Reactive

In a joint session, Noble Energy presented with ADP on the ways they use standards to drive global efficiency in their e-invoicing handling.  The message that stood out most to me was that they were being proactive, not reactive, thanks to e-invoicing.  In other words, by leveraging PIDX standards, Noble is able to address issues before they become problems rather than constantly chasing down errors and answers to questions.  It lets Noble ask the questions instead of being the recipient of questions.  The ability to be proactive drives efficiency by reducing time spent on things such as follow-up and re-work.

The Time is Now

A common theme that presented itself across many presentations was the idea that “the time is now” for e-business automation.  The standards are well-established and the technology is robust.  There is a sense that critical mass is being reached as leaner businesses squeeze out inefficiencies in their back offices.  Businesses that are not automating are quickly falling behind the curve as time marches on.  Ultimately, businesses that ignore the momentum toward the automation of trading partner interactions will put themselves at a competitive disadvantage as electronic business transactions become “table stakes” for businesses across the entire oil and gas industry.

Common Business Challenges

As part of a panel discussion, Deloitte elaborated on the value of standards in the oil and gas industry and made an important point about the business challenges the industry faces.  Specifically, the Deloitte representative highlighted the following challenges:

  • Mergers and Acquisitions
  • Rapid Growth
  • Lack of Integration
  • Technology

The speed of business is constantly accelerating, and nowhere is that more true than in oil and gas right now.  Data exchange standards provide a common language with which to drive process efficiency, which ultimately facilitates M&A and enables rapid growth.  Lack of integration is a historical challenge, but it is now a clearable hurdle for companies willing to invest in their future efficiency.  Technology is constantly changing, and larger organizations may struggle with their internal IT groups to get the right tools to meet urgent business needs.

Not Just the Standard Schema

On the technical side, it all comes down to actually making everything work at the end of the day.  While standards like PIDX provide a standard schema to facilitate communication between businesses, there is still a significant challenge to overcome around semantics.  The PIDX standard provides a framework, but ultimately each implementer uses the standard in a way that makes sense to it.  There is still much more to be done around defining the meaning of terms.  For example, there is consistent disagreement between organizations and individuals over what a “well” is, which is such a fundamental aspect of any oil and gas data discussion.  (PPDM has done a lot of work on this particular example, but challenges remain across the lexicon of oil and gas.)

What’s next?

For businesses in the oil and gas industry looking to drive efficiency in the back office, e-business automation is a proven tool.  If you are interested in learning more about how to reduce manual processes, eliminate paper, decrease days sales outstanding (DSO), and drive down costs, then it’s time to talk to Entrance consultants about creating a vision and road map for your company’s software strategy that leverages standards and the latest technology to enable an efficient back office.

 

There Will Be Bugs! | Application Development Blog Series

Best practices for minimizing the impact and mitigating costs associated with fixing software bugs

First of all, let’s clear something up: what is a bug? A bug is a flaw in software that results in the application not functioning as intended, and is something that is theoretically preventable.

There will be bugs in all of the software applications that you use in your business, whether you custom develop them or buy them “off the shelf.”  Just let that sink in for a moment.

Changes, enhancements, or new features will appear periodically due to changes in the needs of the business, but they are not bugs.  A change (or enhancement) is the result of a new requirement that was not known up-front.  Frequently, owners of custom applications will feel like they have a “bug” because certain business rules are not applied in the desired manner; however, these are often changes or undocumented requirements (i.e., not bugs).

Bugs are an inevitable aspect of any software development project, as all software is created by imperfect humans.  There are a variety of techniques that development firms like Entrance use in the application development practice to detect and eliminate bugs before they reach the production environment, which helps minimize and mitigate the impact on both end users and the project timeline:

  1. Well-written user stories with acceptance criteria – The most important step in preventing bugs is understanding what the application should do before any code is written.  At Entrance, we use a “user story” format to capture detailed requirements as part of our agile project methodology.  Additionally, it is crucial to capture acceptance criteria so that there is no ambiguity about the desired outcome of a feature.
     
  2. Automated testing – Automated testing is one piece of software testing another piece of software.  It increases the efficiency of the quality assurance process while mitigating the likelihood of regression bugs.  In the same way that business software can make your office more efficient, automated testing allows basic component testing to be performed frequently and very quickly (a short sketch appears after this list).
     
  3. Quality assurance review – The most fundamental aspect of assuring quality is testing the software by a specifically trained individual who is dedicated to finding bugs before your users do.
     
  4. User acceptance testing (UAT) – The final checkpoint for quality assurance is testing by actual users.  Entrance helps our clients through the UAT process through the creation of the user acceptance criteria and by facilitating the execution of the user testing process.
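
Returning to item 2 above, here is a minimal sketch of one piece of software testing another.  The invoicing function and its rules are hypothetical, and Python’s built-in unittest framework is used purely for illustration; any testing framework would serve the same purpose:

```python
import unittest

def invoice_total(hours_worked: float, hourly_rate: float, tax_rate: float = 0.0) -> float:
    """Hypothetical business function under test."""
    if hours_worked < 0 or hourly_rate < 0:
        raise ValueError("hours and rate must be non-negative")
    return round(hours_worked * hourly_rate * (1 + tax_rate), 2)

class InvoiceTotalTests(unittest.TestCase):
    # Each test encodes an acceptance criterion; re-running the suite after every
    # change is what catches regression bugs before they reach production.
    def test_basic_total(self):
        self.assertEqual(invoice_total(10, 95.0), 950.00)

    def test_tax_applied(self):
        self.assertEqual(invoice_total(10, 100.0, tax_rate=0.0825), 1082.50)

    def test_rejects_negative_hours(self):
        with self.assertRaises(ValueError):
            invoice_total(-1, 100.0)

if __name__ == "__main__":
    unittest.main()
```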
     

Entrance uses industry-standard tools such as Microsoft’s Team Foundation Server (TFS) to track and manage bugs found in applications that we develop. By tracking bugs in a detailed manner, we can calculate quality metrics and monitor our portfolio of application development projects.  Quality metrics allow us to identify trends, norms, and abnormalities so that we are able to keep all of our projects on track.

The cost of remediating bugs is addressed differently between off-the-shelf and custom applications.  In the Commercial Off-the-Shelf (COTS) environment, you either get “no warranty” (as-is) or you pay an annual “maintenance fee” (usually in the neighborhood of 20% of the software’s cost).  If you’re paying a “maintenance fee,” then you’re pre-paying for bug fixes (think of it as bug insurance).  In the custom development world, as the application owner you pay for bug fixes, but the cost is typically recognized incrementally as bugs appear over time.

There are different ways to manage the cost of remediating bugs that make it through to production.

  1. Warranty – Warranties are not common in the software world due to the complex nature of business applications.  Custom software may be warrantied in certain situations, most commonly in a fixed-fee project where the vendor agrees to implement specific functionality for a firm price.  A warranty might also be offered for a separate fee as an option.  If a warranty is offered, expect the vendor to be very particular about what is or is not covered by the agreement.
     
  2. Time and materials – In a time and materials scenario, the software owner will engage the vendor after a defect is identified and will be subject to the availability of the vendor at the time service is requested.  This option exposes the software owner to the most risk and is generally only advisable for software that is plainly not mission critical.
     
  3. Retainer – Retainers tend to offer the best balance of cost and risk mitigation for most software owners.  A retainer relationship with a vendor guarantees the availability of the vendor up to a certain number of hours (typically per month) for a fixed fee and may provide a discount over base pricing when pre-committing to a sizeable volume.  Additional hours beyond the monthly hours included in the retainer fee are typically offered on a time and materials basis, subject to the availability of the vendor.  The main advantage of a retainer is that you can be assured that the vendor will be available to address any business-critical issues that may arise in a given month.  Depending on the structure of the retainer, hours not used for bug fixes or support may be available for other efforts such as enhancements.  Prepaid hours do not roll over to the next month because the vendor has already committed resources in advance.
     

 

How custom software ages | Application Development Video Series, Episode 1

Not unlike hardware, custom-built software applications can age over time. There are a variety of ways that a business can outgrow a software application, and those aging applications can become more burdensome to maintain than to update or rewrite.

We don’t typically think of software as something that can age. The code is written and runs on a computer, so it doesn’t age the way hardware or people do. Instead, it becomes obsolete or less functional, unable to keep up with the demands of the business and the business users. Software is usually written based on a snapshot of a business need at a certain point or range of time. Unfortunately, software doesn’t change dynamically the way the business does. Businesses change, market demands change, the number of users changes, and the software can have trouble keeping up.

Another aspect of aging is that the software may have been built on a technology that is no longer supported or is slowly on its way out the door; it is being deprecated or replaced by more modern design patterns and technologies. The software was written years ago, and the technical resources who know it are no longer available or are difficult to find. When you can find them, they are expensive, which makes maintaining the software more and more costly.

Technologies, design patterns, and the understanding of software as a functional piece of a business were limited 10-15 years ago, and the technology continues to evolve. Legacy applications were monolithic in nature and written top-to-bottom; every line of code was executed as one big block. To change one little thing in those applications, you had to change everything.  Thankfully, we now have better paradigms alongside better technologies, so we can separate the different pieces of functionality into multiple layers (a brief sketch follows the list below).

  • We can have part of the application that is written specifically to manage the database.
  • We can have another piece that manages business rules and validation.
  • We can have another piece that’s a service layer that allows you to integrate other systems and software, preserving the code that’s already in place for business logic and the database.
  • We also have the user interface and front end of the application. This part is also changing: it used to be just PC-based, but now you need to think about devices like tablets and cell phones with GPS so people can access your software anywhere in the world.
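
As promised above, here is a brief sketch of that layering. The class names and rules are illustrative assumptions, not a real codebase, but they show how the data access, business rules, and service pieces stay separate:

```python
from typing import Optional

class WellRepository:
    """Data-access layer: the only code that knows how well records are stored."""
    def __init__(self) -> None:
        self._wells = {}  # stand-in for a real database table

    def save(self, well_id: str, record: dict) -> None:
        self._wells[well_id] = record

    def get(self, well_id: str) -> Optional[dict]:
        return self._wells.get(well_id)


class WellRules:
    """Business-rule layer: validation and logic, independent of storage and UI."""
    @staticmethod
    def validate(record: dict) -> None:
        if not record.get("name"):
            raise ValueError("A well record must include a name")


class WellService:
    """Service layer: the seam where a UI, mobile app, or another system integrates."""
    def __init__(self, repository: WellRepository) -> None:
        self._repository = repository

    def register_well(self, well_id: str, record: dict) -> None:
        WellRules.validate(record)
        self._repository.save(well_id, record)

    def get_well(self, well_id: str) -> Optional[dict]:
        return self._repository.get(well_id)


# Any front end (desktop, web, or mobile) would talk only to the service layer.
service = WellService(WellRepository())
service.register_well("W-001", {"name": "Example 1H", "county": "Harris"})
print(service.get_well("W-001"))
```

Because each layer only talks to the one below it, you can replace the database, the business rules, or the user interface without rewriting the whole application.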

We begin to realize that there is an aging process that happens with software: as it ages, it becomes more difficult and expensive to maintain, and it misses out on opportunities for growth. For instance, older software wasn’t designed to take advantage of the hardware you’re probably already running, with its multiple processor cores and robust memory. Bringing the software up to date gives you the opportunity to take advantage of those hardware options for better performance.


Increase The Scalability And Efficiency Of Your Oilfield Services With Field Data Capture Software

Don’t let your current time-keeping and work-order processes restrict your growth potential. Streamline and automate these critical tasks with field data capture software.

With the US onshore rig count level holding at ~1,710, 2014 is shaping up to be another banner year for North American oilfield services companies. Are you capturing your share of this business?

Most oilfield professionals we engage with are coming to the realization that they’re leaving money on the table.  Their growth is being stunted or unrealized due to antiquated processes and systems.   The data capture tools and processes currently in the field are simply no longer adequate to meet their future growth plans and the elevated expectations of their clients.

In this blog, I’ll address how using paper-based forms for time-keeping and work-orders in the field – while familiar and convenient – slows down time-sensitive workflows, hinders reporting, and creates data accuracy and integrity issues that negatively impact the payroll and invoicing work streams.  I will also address how a Field Data Capture solution solves these challenges.

Poor data quality – the scourge of Finance and AR

Do you rely on hours-worked documentation from field operations staff to drive invoicing and payroll? Our services industry clients do, so we know the pain of managing invoicing and payroll when that data is missing, incomplete, inaccurate, or untimely.  The impact of poor data quality for hours-worked ripples throughout the organization from the CFO’s concern for Days Sales Outstanding (DSO) to the Accounts Receivable and Payroll clerks that are striving to do their jobs accurately and efficiently.  Electronic field data capture can add scalability and efficiency in the back office AR and Payroll departments.

Five critical questions for AR and payroll teams

AR and Payroll staff must always be on the ball due to the cyclical rhythm of the processes they own in the business.  Whether your processes execute as frequently as weekly or as infrequently as monthly, the obligations of these departments do not change.  For a services firm, AR and Payroll are arguably two of the most critical back office functions.  These questions will help you assess the effectiveness of your own processes:

  • Does your data capture process differentiate between hours billed and hours paid?  In order to ensure accuracy in back office activities, it is critical that your system of record clearly distinguish between hours to be invoiced and hours to be paid (especially for hourly employees).  See the short sketch after these questions.

 

  • How long does it take for your AR/payroll departments to get notified once work is performed? Is it weeks, days, hours, minutes or seconds?  Yes, “seconds” is a possible answer, if you have a mobile-enabled electronic field data capture system in place.

 

  • How long does your back office take to process invoices or payroll once they’ve received time records?  How much time does your back office staff spend performing data entry, rekeying into multiple systems, or manually creating Excel files?  Automation of the static and repetitive invoice generation and payroll processing functions can make your back office staff much more efficient so they can get more done in less time.  In our experience, automation solutions never replace jobs; rather, they let the existing staff be more effective.  The efficiency created by automation reduces the need to hire additional staff as the business scales up.  Additionally, an often overlooked benefit is that automation keeps the staff you do have much happier with their jobs.  The productivity increases enabled by automation allow people to focus on the work that cannot be automated away.

 

  • How often do you receive complaints about inaccurate invoices or payroll errors?  Manual processes are naturally error-prone, even for the most diligent clerk.  Automating them means you set the business rules once, up-front, and can count on the process and the math being applied the same way every cycle.

 

  • Is there one critical person in your business process who controls everything?  If your AR or payroll clerk goes on vacation, gets sick, or retires, are your business processes able to execute without interruption?  Process automation reduces risk to the organization by allowing the business to continue executing as usual even when personnel change.
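
As referenced in the first question above, here is a minimal sketch, with made-up ticket fields and rates, of how an electronic field ticket can keep billed hours and paid hours as separate facts from the moment of capture:

```python
# Hypothetical field ticket: billable hours (what the client is invoiced)
# and payable hours (what the employee is paid) are captured as separate facts.

ticket = {
    "ticket_id": "FT-20140711-042",
    "employee": "J. Smith",
    "hours_billed": 10.0,   # per the client contract
    "hours_paid": 11.5,     # includes travel time paid but not billed
    "bill_rate": 95.0,
    "pay_rate": 32.0,
}

invoice_amount = ticket["hours_billed"] * ticket["bill_rate"]
payroll_amount = ticket["hours_paid"] * ticket["pay_rate"]

print(f"Invoice line:  ${invoice_amount:,.2f}")   # flows to AR
print(f"Payroll line:  ${payroll_amount:,.2f}")   # flows to Payroll
```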

 

Attend our free seminar to learn more!

Hopefully, I’ve convinced you that arming your field workers with new data capture software and mobile devices will dramatically improve information flow between the field and your back office, resulting in more efficient and scalable processes that enable your workforce to support more clients and more work rather than hinder it.  While I focused specifically on time keeping and work orders, a robust Field Data Capture solution provides similar benefits for inspection reports and asset integrity management.

If you are interested in learning more about how to jump-start your own Field Data Capture improvement initiative and are already starting to consider whether a custom and/or packaged data capture solution is the right approach, I highly recommend that you attend our FREE lunch seminar at III Forks on July 17th, where Nate Richards will provide an overview of Field Data Capture solutions for oilfield services.

Learn more:

  • Getting started video: Electronic Field Ticketing
  • Case study: NA Pipeline Services Company Increases Operational Visibility
  • Solution page: Electronic Field Ticketing Connects Field to Corner Office

3 Signs that your data is trapped and 3 simple steps to unleash it

I attended the 18th International Conference on Petroleum Data Integration, Information and Data Management (commonly known as the “PNEC” conference) on May 20-22, 2014.  As I reflected on the various papers and talks, it was clear that “Big Data” is both a significant challenge and an amazing opportunity for upstream oil and gas.  I was inspired by a talk on the topic of Big Data, and it got me thinking about how you actually go about unlocking that power.

3 signs that your data is trapped

  1. Too much data
    Consumers of data in the business typically lack the technical skill set required for data integration and analysis.  In order to effectively mine your data, you must be able to assemble the data in a readily consumable format.  Given the volume of data generated daily in a modern upstream oil and gas operation, no single business user can know what exists, where it is, or how to get it.
  2. Too many repositories
    As a result of the volume of data and the specialization of various departments (e.g., G&G, drilling, production, accounting, etc.) each department typically has its own preferred system of record.  While having specialized systems is beneficial, if not outright required, it results in “stovepipe systems” or data silos.  Additionally, because common business metadata is logically shared between departments, data is duplicated and data quality suffers.  With no single version of the truth, it is difficult to ascertain what a well name is, what the spud date was or in many cases what even a “well” is (e.g., a completion, a wellbore, a wellbore completion, etc.).
  3. Unable to analyze effectively (can’t ask the “What if…” questions)
    Even if you have all of your data in one place, making it accessible to the business users in an easily consumable format requires specialized Business Intelligence software.  A modern BI platform allows you to put the power in the business users’ hands to get the insights they need without requiring IT to spoon-feed the information.  IT’s responsibility should be to put a solid and extensible platform in place to empower the creativity and natural curiosity of the subject matter experts within the business.

How do you unlock the value of your information assets?

  1. Identify the systems of record
    To get started, you have to know the systems in which the data resides and how it gets into those systems. This is a discovery process that is required because even if you think you know where everything is, more than likely you don’t have the full picture. Keep in mind that vast quantities of data may be locked-up in spreadsheet files on a user’s desktop or even on paper in a remote field office!
  2. Create an integrated repository
    Once you’ve identified the systems of record, the next step is to build a “Master Data Management” (MDM) solution that aggregates and synchronizes all of the key business metadata.  In addition, a comprehensive data warehouse should be layered on top of the MDM solution to provide a single location against which aggregate business intelligence analysis may be performed.  It is important to leverage standards when embarking on MDM and data warehousing projects (I talk about that more in another blog post).
  3. Use a modern Business Intelligence tool
    Excel, while it remains the Swiss Army Knife of business software, does not provide the powerful data visualization capabilities of a modern BI tool.  There are many great tools in the BI software space today.  Applications such as Tableau, Spotfire, Power BI and many others put business users in the driver’s seat with powerful yet easy-to-use data visualization features.  However, these tools are of limited use by themselves unless you’ve done your homework to build out MDM and a data warehouse for your business.  Ultimately, you want the business to be agile, i.e., able to quickly adapt and change direction based on insights from the organization’s collective data.  When the business users can ask the “what if” questions and easily get answers, then you know you’ve reached BI Zen (a brief illustration follows this list).
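
To illustrate that last point, here is a small example of the kind of ad hoc “what if” question a business user should be able to answer once the data is integrated.  The data, the column names, and the use of the pandas library are illustrative assumptions only:

```python
import pandas as pd

# Made-up integrated production data; in practice this would come from the
# data warehouse built on top of the MDM solution.
production = pd.DataFrame({
    "well_id": ["W-001", "W-001", "W-002", "W-002", "W-003"],
    "field":   ["Eagle Ford", "Eagle Ford", "Permian", "Permian", "Permian"],
    "month":   ["2014-04", "2014-05", "2014-04", "2014-05", "2014-05"],
    "oil_bbl": [4200, 3900, 6100, 5800, 2500],
})

# "What if" style question: which field produced the most oil in May 2014?
may = production[production["month"] == "2014-05"]
print(may.groupby("field")["oil_bbl"].sum().sort_values(ascending=False))
```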

What does all of this mean in the end?

The ability to efficiently aggregate and analyze data is the goal of the “big data” movement. Buzzwords aside, ask yourself if you are extracting all of the insights that your data can provide. If not, then it’s time to Frac Your Data and Produce Insight. The energy software experts at Entrance have the experience and skills you need to unleash your trapped data.

How to leverage standards in the Oil and Gas Industry

The oil and gas industry is both blessed and cursed by the proliferation of various standards for data organization and exchange.  Standards are vital to efficiency when creating and managing a master data management or system integration initiative in any industry, and none more so than oil and gas.  The Standards Leadership Council is an umbrella organization that represents the industry’s standards leaders.

Standards are leveraged in two key ways:

  1. Creating a single version of the truth within your business (master data management).
  2. Facilitating inter-system data exchange within your business or between multiple businesses (system integration).

Effectively implementing Master Data Management necessarily involves system integration, so it is important to consider both challenges together.  PPDM’s data model is the de facto standard for oil and gas data management.

How do you get started?

Oil and gas master data management always starts with:

  1.  Well naming and identification
    Human-friendly “text” names for wells frequently vary depending on the person, department, or software referencing them, and they are never a good enterprise data key, regardless of industry.  Standards such as the API Well Number are very useful but don’t fulfill the true need for a globally unique identifier, because API well numbers are typically only assigned after permitting.  How does your organization uniquely identify a well in its lifecycle before it is permitted?  A surrogate key must be established that is guaranteed to be unique.  Furthermore, you must establish a cross-reference table to disambiguate keys across all systems, including your master data management solution (see the brief sketch after this list).  Have you identified all of the source systems in your organization, what they are used for, and what data they share?
  2. Well data hierarchy
    Have you asked the question “What is a well?” in your organization?  If so, how many different answers did you get?  Probably more than one.  PPDM has led the way in defining industry standards for the well taxonomy.  Have you thought about the difference between Well Origin, Wellbore, Wellbore Segment, Wellbore Contact Interval, Wellbore Completion and Wellhead Stream?  The beauty of thinking about wells in this way is that necessarily every well has only one valid “Well Origin” at any point in time – this may seem simple but it is vital to executing proper oil and gas data management.
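
As a brief sketch of the surrogate-key and cross-reference idea from item 1 (all identifiers and system names below are hypothetical):

```python
import uuid

# Each well gets a surrogate key that is unique from day one, before any
# API number exists; the cross-reference maps every source system's ID to it.
master_well_id = str(uuid.uuid4())

well_xref = {
    master_well_id: {
        "api_number": None,                 # assigned later, after permitting
        "geology_system_id": "GG-77231",    # hypothetical G&G application ID
        "production_system_id": "PRD-00418",
        "accounting_system_id": "ACCT-5590",
    }
}

def find_master_id(system: str, system_id: str):
    """Resolve any source-system identifier back to the single master key."""
    for master, ids in well_xref.items():
        if ids.get(system) == system_id:
            return master
    return None

print(find_master_id("production_system_id", "PRD-00418") == master_well_id)  # True
```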

Where do you go from here?

It depends on what your specific business objectives are for the master data management initiative.  Usually, normalizing the concept of “well status” is near the top of the list, because a status change in a well (or wellbore, or wellbore completion, etc.) may be a workflow trigger in the lifecycle of the asset (e.g., a lease provision triggered by spudding or completion).

While the world of oil and gas standards may be daunting to contemplate, software experts such as those of us at Entrance can show you the way.