Creating Contour Plots in Spotfire for Drilling Optimization

Utilizing contour plots in Spotfire to simplify drilling plans

With custom contour plots in Spotfire, drilling engineers can vary any two independent parameters (the X and Y axes) to find the combination of values that produces the best result for a third, dependent parameter (Z). In this case, we look at combinations of bit RPM and weight on bit to see which one produces the best rate of penetration (ROP) for a given formation type.

Contour plots in Spotfire for drilling engineers

Typically, in order to recognize patterns that would help optimize the plan for a new well, drilling engineers have to compare two variables across multiple Excel spreadsheets to find where they intersect, and then build a new spreadsheet for each additional pair of variables. This works, but it takes time the engineer could otherwise spend making decisions, it can delay the start of drilling, and the manual manipulation of the data introduces the risk of errors that could create problems during drilling.

Using a visualization tool like Spotfire paired with the TERR engine (TIBCO Enterprise Runtime for R), we can take the information gathered from previously drilled wells in the same or a similar formation and build Spotfire contour plots that help us evaluate comparable drilling parameters and formulate an ideal drilling plan for a future well. This shortens the time it takes to get a well producing, both by cutting planning time before drilling starts and by speeding decisions while the well is being drilled.
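The same idea can be sketched outside Spotfire in a few lines of Python. The data file, column names (rpm, wob_klbs, rop_ft_hr), and grid resolution below are hypothetical placeholders for whatever your offset-well data actually contains; in Spotfire itself, a TERR data function would typically handle the interpolation step.

```python
# A minimal sketch of the contour-plot idea, assuming a CSV of offset-well
# data with hypothetical columns: rpm, wob_klbs, rop_ft_hr.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

df = pd.read_csv("offset_well_drilling_data.csv")  # hypothetical file name

# Build a regular grid over the two independent parameters (X = RPM, Y = WOB).
rpm_grid, wob_grid = np.meshgrid(
    np.linspace(df["rpm"].min(), df["rpm"].max(), 100),
    np.linspace(df["wob_klbs"].min(), df["wob_klbs"].max(), 100),
)

# Interpolate the dependent parameter (Z = ROP) onto that grid.
rop_grid = griddata(
    points=df[["rpm", "wob_klbs"]].values,
    values=df["rop_ft_hr"].values,
    xi=(rpm_grid, wob_grid),
    method="linear",
)

# Contour the result; the "sweet spot" is the region of highest ROP.
fig, ax = plt.subplots()
contours = ax.contourf(rpm_grid, wob_grid, rop_grid, levels=15, cmap="viridis")
fig.colorbar(contours, label="ROP (ft/hr)")
ax.set_xlabel("Bit RPM")
ax.set_ylabel("Weight on bit (klbs)")
ax.set_title("ROP contour from offset-well data")
plt.show()
```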

The uses for a contour plot in Spotfire extend well beyond typical drilling parameters. Our clients also use it to evaluate other performance variables; in worker health and safety, for example, plotting hours on the job against the number of accidents helps identify correlations and reduce risk in the future.

The initial creation of a contour plot in Spotfire usually requires an advanced understanding of the platform. Entrance’s Spotfire consultants can typically set this up for a client in a couple of weeks, after which it becomes easy for you to swap out variables for new drilling plans.

Call me today to get a custom TIBCO Spotfire contour plot for your team:
1-888-343-KNOW

Self-Service Analytics for Finance and Accounting


Chances are, business intelligence is on every accountant’s Christmas list this year.

The energy industry of 2015 is shaping up to be a world of slim margins and fast turnaround, as producers, distributors, and marketers try to maximize the effect of their smaller capital budgets. Accounting and finance departments will feel this pressure like no one else. Between processing and paying invoices, regulatory reporting, and the myriad one-off financial reports that executives need, every second counts in a finance department’s day.

In 2015, those seconds are even more precious. Invoices need to be paid faster, cash needs to be freed up quicker, and overrun costs need to be visible sooner.

So if you’re an IT manager or CIO trying to pick out a holiday gift for that certain special accounting department, consider a self-service analytics platform.

What is “Self-Service Analytics?”

Let’s consider the maturity of an organization based on how it uses data.

Self-service analytics is, as the name suggests, an ecosystem of empowered users accessing data from several line-of-business systems in one place, analyzing that data with a robust tool, and collaborating around the results in a central location. It’s the pinnacle of organizational data maturity, short of predictive and prescriptive analytics.

There are many challenges to implementing self-service analytics, which is why organizations turn to business intelligence consultants to craft a BI strategy and implement it using industry best practices. The technical challenges are real, but the biggest challenge is organizational: changing the way users think about gathering data, and how they use it day to day, is a process that requires experience and careful planning.

For an individual, the transition from mashing up line-of-business reports in Excel to creating analyses and dashboards in a tool like Spotfire or Tableau requires three key things:

* Familiarity with the data

* Knowledge of the platform

* Confidence in the solution

Without those three things, self-service analytics will struggle to take hold across an organization. In implementing a BI strategy with self-service analytics in mind, it is up to the BI consultants to leverage an organization’s existing familiarity with its data while building the platform, training users on it, and, over time, earning their confidence in the solution.

Finance and Accounting: Champions of Self-Service Analytics

Many folks in upstream energy consider geophysicists and production engineers to be the most data-savvy people in the office, but I think that honor goes to accounting. They simply do so much with so little, and their impact is felt across every other department.

Accountants and financial analysts spend every day surrounded by static spreadsheets and grids of data. They know both the data and the financial workflows of the business inside and out. And they’re generally open to learning new technology platforms, especially ones that might free them from mashing together Excel spreadsheets by hand.

Generally speaking, finance and accounting love analytics, hate the limitations of the tools they’re given and would like an easier way to do analysis. They can take a self-service analytics platform and turn it into real, noticeable ROI without the overhead of explaining how the data works.

The Impact of Self-Service Analytics

Let’s consider an example of how self-service analytics can transform a finance department:

Laurie is a financial analyst at an independent oil and gas producer in Houston, Texas. She’s been tasked with tracking how vendor costs have changed over time, and the COO also wants her to produce a model that can help select new vendors for 2015. But even with access to the company’s ERP system and general ledger, Laurie still has to mash the data together by hand using reports that are only available as flat files delivered via e-mail.

Every month, Laurie (along with the rest of the accounting department) loses an entire workweek to booking accruals and closing the books for the accounting period. She’s also one of the few individuals who have access to both the general ledger and the ERP system, so the CFO and COO are constantly asking her to create Excel charts for presentations to the board.

One day, early in 2015, she receives an e-mail asking her to attend a lunch-and-learn delivered by Entrance about how to use Spotfire for financial analysis. She’s never heard of Entrance, but she has heard of Spotfire, and she knows a BI initiative is “in the works” at her employer.

Laurie learns that Entrance has set up Spotfire Server as the new go-to place for data across the organization. Data from the ERP, general ledger, production accounting, and project management systems are all ready to use. Just having the information links published to Spotfire Server and available in one consolidated place cuts down on the time spent mashing the data by hand.

Then she watches the consultant from Entrance create a new Spotfire analysis. He pulls together data from three systems, visualizes estimated costs and posted invoice amounts by service type over service dates, and lays the foundation for Laurie’s analysis. Each page lets her drill down from the business-unit level to supervisor to supplier. She can tag items for review by others, bookmark steps in her analysis, and forward them to her peers and managers. She already knows the data; as she learns the tool, she gains confidence in the process and the solution.
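For readers who want to picture what those information links are doing, here is a rough sketch in pandas of the kind of join-and-aggregate behind an analysis like Laurie’s. The file names, column names, and join keys are hypothetical, and Spotfire would do this work through its data layer rather than in Python.

```python
# A rough sketch of the join-and-aggregate behind a vendor-cost analysis.
# File names, columns, and join keys are hypothetical placeholders.
import pandas as pd

# Extracts from the (hypothetical) ERP and project management systems.
# invoices: business_unit, vendor, service_type, service_date, amount
# estimates: business_unit, service_type, service_date, estimate
invoices = pd.read_csv("erp_posted_invoices.csv", parse_dates=["service_date"])
estimates = pd.read_csv("afe_cost_estimates.csv", parse_dates=["service_date"])

# Join actuals to estimates on the dimensions both extracts share (simplified).
merged = invoices.merge(
    estimates,
    on=["business_unit", "service_type", "service_date"],
    how="left",
)

# Roll posted amounts and estimates up by service type and month, the same
# cut a Spotfire page would show before drilling down to supervisor or supplier.
summary = (
    merged
    .assign(month=merged["service_date"].dt.to_period("M"))
    .groupby(["service_type", "month"], as_index=False)[["amount", "estimate"]]
    .sum()
)
summary["variance"] = summary["amount"] - summary["estimate"]
print(summary.sort_values("variance", ascending=False).head(10))
```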

Armed with her new analytics tool, Laurie is able to quickly build up a first pass at the analysis she was asked to create. She publishes the analysis to the server for a few other members of the finance team to analyze. They offer some suggestions and Laurie revises her analysis before sending a link to the CFO.

The CFO is very impressed. It’s clear that a few mud pump vendors are linked directly to higher maintenance costs, so the CFO calls the COO into the office to learn more about Laurie’s methods. Pretty soon, her counterparts across the organization are using her analysis as the basis for their own vendor cost reports.

Stories like Laurie’s play out across the energy industry every day, with empowered individuals uncovering new trends and building new insight through self-service analytics. Data integration and visualization are essential parts of a business intelligence solution, but self-service analytics completes the picture. It is an exciting way to leverage the expertise and creativity of traditional data consumers and multiply the effect of business intelligence throughout an organization.


5 Key Takeaways from TIBCO NOW

The Entrance BI Practice was well represented at this year’s TIBCO NOW conference, which featured keynote presentations by disruptive thought leaders like VC legend Marc Andreessen, TIBCO founder and CEO Vivek Ranadive, and bestselling author Malcolm Gladwell.

Five things stand out in my mind as I look back upon the week so far:

1. TIBCO Jaspersoft is bringing BI to the masses

· I have to admit, my experience as a BI consultant building self-service analysis tools on the Spotfire platform left me wondering whether there is any room for “pixel-perfect” reporting in the BI landscape. TIBCO addressed these concerns at the conference by clearly demonstrating how Jaspersoft will play a very important role as BI expands from the top of an organization down to the users who have traditionally been on the “fringe.” Jaspersoft empowers developers to drop reporting and BI controls directly into their applications, just like the graphing controls of old, but with the added bonus of flexibility. End users can dive into the data, change the presentation of a visualization, and generate reports for day-to-day use. This is huge. “Data democracy” is coming, and TIBCO is leading the revolution with Jaspersoft.

2. Centers of Excellence are a force multiplier in business intelligence

· Business intelligence consultants like me have an important role to play in demonstrating best practices, developing solutions, and teaching organizations how to start taking ownership of their data. But a truly successful solution features a community within the organization itself: a Center of Excellence. These power users see the value of data and know how to help grow BI in every direction.

3. Data isn’t just big, it’s fast

· Many of TIBCO’s breakout sessions this week focused on the rise of “fast data” in a truly real-time sense. This has implications for every enterprise, and oil and gas is no different. Imagine being able to predict pump failures before they occur, and to show deviations from normal operation to an engineer who can act before maintenance turns into replacement. TIBCO’s StreamBase working in concert with Spotfire gives upstream operators the power to create models and apply them to live data; a toy sketch of that pattern appears at the end of this post. That’s the power of fast data and real-time event processing.

4. Integration isn’t a luxury, it’s a must

· Business processes and the applications that drive them generate a wealth of data that cannot be turned into actionable information without integration. Whether you’re a multinational player in energy or an independent focused on the next big asset play, making the right decisions requires input from every department working in concert.

Making those decisions in a timely manner demands integration from all departments. And master data management doesn’t hurt, either!

5. Big things are on the horizon

· Sometimes, you need to turn humongous swaths of data into actionable information– we’re talking about terabytes and petabytes that simply won’t fit into a traditional in-memory database. IT has the power to make this a reality today, by leveraging the power of the cloud in concert with its on-prem assets. Platforms such as Apache Spark and NoSQL databases like MongoDB are empowering organizations to operate on this sort of truly big data. But leaving the data in the cloud (say, in Microsoft Azure) allows IT to focus on providing solutions rather than maintaining the complex infrastructure involved. And that’s just the beginning. TIBCO is working on a new technology that leverages graph databases to solve a set of previously impractical problems. I’m very excited to see what’s in store.
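To make takeaway 3 a little more concrete, here is a toy sketch of the fast-data pattern: watching a stream of sensor readings and flagging deviations from a rolling baseline. The pump-pressure feed, window size, and threshold are all invented for illustration; a production system would score a real model inside an engine like StreamBase and surface the flagged events in Spotfire rather than printing them.

```python
# Toy sketch: flag deviations from normal operation in a live sensor stream.
# The feed, window size, and 3-sigma threshold are invented for illustration.
from collections import deque
import random
import statistics

def pump_pressure_stream():
    """Stand-in for a live feed of pump discharge pressure readings (psi)."""
    while True:
        yield random.gauss(3000, 25)  # normal operation around 3,000 psi

window = deque(maxlen=120)  # rolling baseline of the most recent readings

for i, reading in enumerate(pump_pressure_stream()):
    window.append(reading)
    if len(window) < window.maxlen:
        continue  # wait until the baseline is established
    mean = statistics.fmean(window)
    stdev = statistics.stdev(window)
    # Flag readings far outside the rolling baseline for an engineer to review.
    if abs(reading - mean) > 3 * stdev:
        print(f"Reading {i}: {reading:.0f} psi deviates from baseline "
              f"{mean:.0f} +/- {stdev:.0f} psi")
    if i > 5000:
        break  # keep the demo finite
```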