Self-Service Analytics for Finance and Accounting


Chances are, business intelligence is on every accountant’s Christmas list this year.

The energy industry of 2015 is shaping up to be a world of slim margins and fast turnarounds, as producers, distributors and marketers try to maximize the effect of smaller capital budgets. Accounting and finance departments will feel this pressure more than any other. Between processing and paying invoices, regulatory reporting, and the myriad one-off financial reports that executives need, every precious second counts in a finance department's day.

In 2015, those seconds are even more precious. Invoices need to be paid faster, cash needs to be freed up quicker, and overrun costs need to be visible sooner.

So if you’re an IT manager or CIO trying to pick out a holiday gift for that certain special accounting department, consider a self-service analytics platform.

What is “Self-Service Analytics?”

Let’s consider the maturity of an organization based on how it uses data.

Self-service analytics is, as the name suggests, an ecosystem of empowered users who access data from several line-of-business systems in one place, analyze it with a robust tool, and collaborate around the results in a central location. Short of predictive and prescriptive analytics, it is the pinnacle of organizational data maturity.

There are many challenges to implementing self-service analytics, which is why organizations turn to business intelligence consultants to craft a BI strategy and implement it using industry best practices. The technical challenges are real, but the biggest challenge is organizational: changing the way users think about gathering data, and how they use it day to day, is a process that requires experience and careful planning.

For an individual, the transition from mashing up line-of-business reports in Excel to creating analyses and dashboards in a tool like Spotfire or Tableau requires three key things:

* Familiarity with the data

* Knowledge of the platform

* Confidence in the solution

Without those three things, self-service analytics will have a very hard time growing across an organization. When implementing a BI strategy with self-service analytics in mind, it falls to BI consultants to leverage an organization's familiarity with its data while training users on the platform and, over time, building their confidence in the solution.

Finance and Accounting: Champions of Self-Service Analytics

Many folks in upstream energy consider geophysicists and production engineers to be the most data-savvy folks in the office, but I think that honor goes to accounting. They simply do so much with so little, and with an impact that can be felt across all other departments.

Accountants and financial analysts spend each day surrounded by static spreadsheets and grids of data. They know both the data and the financial workflows of a business inside and out. And they're generally open to learning new technology platforms, especially ones that might free them from mashing together Excel spreadsheets by hand.

Generally speaking, finance and accounting love analytics, hate the limitations of the tools they’re given and would like an easier way to do analysis. They can take a self-service analytics platform and turn it into real, noticeable ROI without the overhead of explaining how the data works.

The Impact of Self-Service Analytics

Let’s consider an example of how self-service analytics can transform a finance department:

Laurie is a financial analyst at an independent oil and gas producer in Houston, Texas. She's been tasked with tracking how vendor costs have changed over time. The COO also wants her to produce a model that can aid in selecting new vendors for 2015. But even with access to the company's ERP system and general ledger, Laurie still has to mash the data up by hand using reports that are only available as flat files delivered via e-mail.

Every month, Laurie (along with the rest of the accounting department) loses an entire workweek to booking accruals and closing the books for the accounting period. She’s also one of a few individuals who has access to both the general ledger and the ERP system, so the CFO and COO are constantly asking her to create Excel charts to be used in presentations to the board.

One day, early in 2015, she receives an e-mail asking her to attend a lunch-and-learn training delivered by Entrance about how to use Spotfire for financial analysis. She's never heard of Entrance, but she has heard of Spotfire, and she knows a BI initiative is "in the works" at her employer.

Laurie learns that Entrance has set up Spotfire Server as the new go-to place for data across the organization. Data from the ERP, general ledger, production accounting, and project management systems are all ready to use. Just having the information links published to Spotfire Server and available in one consolidated place cuts down on the time spent mashing the data by hand.

Then she watches the consultant from Entrance create a new Spotfire analysis. He pulls together data from three systems, visualizes estimated costs and posted invoice amounts by service type over service dates, and lays the foundation for Laurie's analysis. Each page lets her drill down from business-unit level to supervisor to supplier. She can tag items for review by others. She can bookmark steps in her analysis and forward them to peers and managers. Knowing the data and learning the tool, she gains confidence in the solution.

Armed with her new analytics tool, Laurie is able to quickly build up a first pass at the analysis she was asked to create. She publishes the analysis to the server for a few other members of the finance team to analyze. They offer some suggestions and Laurie revises her analysis before sending a link to the CFO.

The CFO is very impressed. It's clear that a few mud pump vendors are linked directly to higher maintenance costs, so the CFO calls the COO into the office to learn more about Laurie's methods. Pretty soon, her counterparts across the organization are using her analysis as the basis for their own vendor cost reports.

Stories like Laurie's play out across the energy industry every day, with empowered individuals uncovering new trends and building new insight through self-service analytics. Data integration and visualization are important parts of a business intelligence solution, but self-service analytics completes the picture. It is an exciting way to leverage the expertise and creativity of traditional data consumers and multiply the effect of business intelligence throughout an organization.

 

 

Why can’t you just tell me how much it’s going to cost?

If you’ve ever engaged a vendor on a software development project, the vendor probably gave you an “estimate.” You may ask, “Why can’t the vendor just give me a pricing schedule or a firm quote?”

Software development is a precise way of automating an imprecise process. It is very hard to use a programming language to translate business rules to a computer, because computers only understand 100% true and 100% false. A business process that only happens “sometimes” quite literally does not compute. If you can describe the rules that govern the process of your business in 100% of scenarios in 100% true/false language (no “maybe” allowed) then it would be theoretically possible to precisely calculate the effort involved in a software project (on a related note, the likelihood of having any bugs would also get pretty close to zero).
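To make that concrete, here is a minimal sketch; the discount rule and its thresholds are hypothetical, invented purely for illustration, of what it takes to turn a "sometimes" rule into the 100% true/false language a computer requires:

    # Stakeholder rule as stated: "We usually give repeat customers a discount."
    # "Usually" does not compute. The rule becomes implementable only once
    # every ambiguity is pinned down (the thresholds below are made up):

    def discount_rate(prior_orders: int, in_good_standing: bool) -> float:
        """Return the discount as a fraction of the invoice total."""
        # "Repeat customer" pinned down as: three or more prior orders.
        # "Usually" pinned down as: only while the account is in good standing.
        if prior_orders >= 3 and in_good_standing:
            return 0.10
        return 0.0

    print(discount_rate(prior_orders=5, in_good_standing=True))   # 0.1
    print(discount_rate(prior_orders=5, in_good_standing=False))  # 0.0

Every "maybe" a stakeholder leaves in a requirement becomes a decision the developer has to chase down, which is exactly why effort is so hard to quote up-front.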

Another issue software vendors face when we put together a project estimate is a modern world in which the way you run your business is constantly changing. We’re all fighting a battle just to keep up with changes in the market, staff turnover/competencies, and evolving business processes to maximize operational efficiency. Software projects take time to implement, and it’s increasingly likely that a requirement that made sense at the outset of the project may no longer make sense a few months later when the application is deployed to production.

To combat the challenge of change, Entrance employs an agile software development methodology that allows the software requirements to morph over time as the needs of the business evolve.

The “iron triangle” of project management is well established: scope, schedule and budget (as the saying goes, “pick two”). Entrance typically engages with clients by fixing the schedule and budget, which allows the scope to adapt.

To set schedule and budget for a project, Entrance identifies the make-up of the team needed to achieve the project objectives. By identifying the team in advance, Entrance can predict the project “burn rate” with a high degree of accuracy. If we know how many people are working for how long, then schedule and budget are known.

In addition to setting a budget, Entrance recommends establishing a project contingency budget as well. Typically, we recommend a 20% contingency. The contingency may or may not get used on a particular project but it is not uncommon for a project to go into the contingency budget. If your vendor gives you one number for an estimate (i.e., no contingency or no budget range), then it is advisable for you to add your own 20% contingency for budget planning purposes. Frequently, contingency budget is used on agile projects to address additional scope that may not have been originally anticipated. However, the project sponsor ultimately controls whether or not the value proposition exists to justify moving forward.
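To see how fixing the team fixes the numbers, here is a minimal sketch of the arithmetic. The roles, rates and schedule below are hypothetical, not Entrance's actual pricing:

    # Hypothetical blended hourly rates for an illustrative four-person team.
    team_rates = {
        "project manager": 150,
        "senior developer": 140,
        "developer": 120,
        "qa analyst": 100,
    }

    hours_per_week = 40   # assumes full allocation to the project
    weeks = 16            # the fixed schedule

    weekly_burn = sum(team_rates.values()) * hours_per_week
    base_budget = weekly_burn * weeks
    contingency = 0.20 * base_budget  # the 20% contingency discussed above

    print(f"Weekly burn:       ${weekly_burn:,}")                   # $20,400
    print(f"Base budget:       ${base_budget:,}")                   # $326,400
    print(f"Budget w/ reserve: ${base_budget + contingency:,.0f}")  # $391,680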

By staying engaged with a product owner and project sponsor via semimonthly iterations in our agile process, we have achieved great success in meeting objectives without a fixed scope up-front. Contrary to what you may think, we actually do more planning in the agile process than in a traditional "waterfall" (all requirements identified up-front) approach. Because we work in time-boxed iterations, you see progress regularly and can make small course corrections as we go. By allowing the scope to adapt to your needs, we put you in the driver's seat to ensure that the implemented functionality meets your most current business requirements.

 

5 Key Takeaways from TIBCO NOW

The Entrance BI Practice was well-represented at this year's TIBCO NOW conference, which featured keynote presentations by disruptive thought leaders like VC legend Marc Andreessen, TIBCO Founder and CEO Vivek Ranadivé and bestselling author Malcolm Gladwell.

Five things stand out in my mind as I look back upon the week so far:

1. TIBCO Jaspersoft is bringing BI to the masses

I have to admit, my experience as a BI consultant building self-service analysis tools on the Spotfire platform left me wondering whether there is any room for "pixel-perfect" reporting in the BI landscape. TIBCO addressed these concerns at the conference by clearly demonstrating how Jaspersoft will play a very important role as BI expands from the top of an organization down to the users who have traditionally been on the "fringe." Jaspersoft empowers developers to drop reporting and BI controls directly into their applications, just like the graphing controls of old, but with the added bonus of flexibility. End users can dive into the data, change the presentation of a visualization, and generate reports for day-to-day use. This is huge. "Data democracy" is coming, and TIBCO is leading the revolution with Jaspersoft.

2. Centers of Excellence are a force multiplier in business intelligence

Business intelligence consultants like me have an important role to play in demonstrating best practices, developing solutions and teaching organizations how to start taking ownership of their data. But a truly successful solution features a community within the organization itself: a Center of Excellence. These power users see the value of data and know how to help grow BI in every direction.

3. Data isn’t just big, it’s fast

Many of TIBCO's breakout sessions this week focused on the rise of "fast data" in a truly real-time sense. This has implications for every enterprise, and oil and gas is no different. Imagine being able to predict pump failures before they occur, and to show deviations from normal operation to an engineer who can act before maintenance turns into replacement. TIBCO's StreamBase working in concert with Spotfire gives upstream the power to create models and apply them to live data. That's the power of fast data and real-time event processing.
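The StreamBase/Spotfire stack itself is beyond the scope of this post, but the core idea behind it is easy to sketch. Below is a minimal, hypothetical Python illustration (not TIBCO's product) of scoring live pump readings against their recent history so deviations surface early:

    from collections import deque
    from statistics import mean, stdev

    class PumpMonitor:
        """Flags readings that deviate from recent normal operation
        using a rolling z-score (illustrative, not a production model)."""

        def __init__(self, window: int = 60, threshold: float = 3.0):
            self.readings = deque(maxlen=window)
            self.threshold = threshold

        def observe(self, value: float) -> bool:
            """Return True if the new reading looks abnormal."""
            alarm = False
            if len(self.readings) >= 30:  # wait for enough baseline history
                mu, sigma = mean(self.readings), stdev(self.readings)
                if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                    alarm = True
            self.readings.append(value)
            return alarm

    # monitor = PumpMonitor()
    # for pressure in live_sensor_feed():   # hypothetical event stream
    #     if monitor.observe(pressure):
    #         alert_engineer(pressure)      # act before failure occurs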

4. Integration isn’t a luxury, it’s a must

Business processes and the applications that drive them generate a wealth of data that cannot be turned into actionable information without integration. Whether you're a multinational player in energy or an independent focused on the next big asset play, making the right decisions requires input from every department working in concert.

Making those decisions in a timely manner demands integration from all departments. And master data management doesn’t hurt, either!

5. Big things are on the horizon

Sometimes you need to turn humongous swaths of data into actionable information; we're talking about terabytes and petabytes that simply won't fit into a traditional in-memory database. IT has the power to make this a reality today by leveraging the power of the cloud in concert with its on-prem assets. Platforms such as Apache Spark and NoSQL databases like MongoDB are empowering organizations to operate on this sort of truly big data. And leaving the data in the cloud (say, in Microsoft Azure) allows IT to focus on providing solutions rather than maintaining the complex infrastructure involved. That's just the beginning: TIBCO is working on a new technology that leverages graph databases to solve a set of previously impractical problems. I'm very excited to see what's in store.

 

 

Entrance Experience: Day 1 at TIBCO NOW


 

When it comes to data visualization, there is no bigger or more significant gathering than the annual TIBCO NOW event. Whether you are a partner, a user, or a data scientist, this is the place to be for the latest TIBCO Spotfire product developments and usage best practices. As a new TIBCO silver partner, Entrance was extremely excited to be attending for the first time.

Day one did not disappoint with an outstanding line-up of keynote speakers.

The theme of the 2015 event is "Your World is Being Disrupted," and TIBCO delivered against it with a number of great presentations offering tips for winning in the age of technology and staying ahead of the competition.

TIBCO CEO Vivek Ranadivé kicked off the first day with the morning keynote, which opened with a brief address by San Francisco mayor Ed Lee about how the city is embracing technology. After the mayor left, Ranadivé turned to his belief that we are in an era he calls Civilization 3.0. Civilization 1.0 was the pre-industrial era of the agrarian revolution: the age of the artisan, when people were individual contributors, farmers, shopkeepers, carpenters and painters. The industrial revolution brought us Civilization 2.0, which ushered in the age of the corporation, beginning with Ford's assembly line in 1913, with the main focus being to organize people and systems for efficiency.

Ranadivé says we are now entering Civilization 3.0. With the availability of platforms where individuals can reach large audiences, the emphasis shifts back to the value creators. Civilization 3.0 is about offering a service that delivers extreme value to its constituents.

In Civilization 3.0, five forces power the movement:

  1. The explosion of data
  2. The rise of mobility
  3. The emergence of platforms that allow you to reach a bigger audience (for example, Twitter)
  4. The rise of Asian economies
  5. The understanding of how math is trumping science

These forces drive us toward the age of service, in which every business operates like a social network and all information is perishable. Looking ahead in Civilization 3.0, Ranadivé predicts that within 15 years computer programming will be the solution to curing diseases, and perhaps to predicting conditions such as heart attacks.

Data visualization tools such as TIBCO Spotfire allow companies to collaborate better, internally and externally, and to drive greater shareholder and investor value. When we implement such tools, we are empowering our clients with Civilization 3.0 capabilities: analyzing data for signals, comparing new data against past data, visualizing it and, finally, turning the insights gleaned into competitive advantage.

As a company committed to delivering software that enriches people's lives, we loved Marc Andreessen's keynote, which was really more of a conversation about innovation and how every business is "becoming" a software company. As the co-creator of Mosaic, the first widely used graphical web browser, co-founder of Netscape and a venture capitalist backing an array of web companies including Twitter, Skype and Instagram, Marc makes the point that we need to assume customers are connected 24x7 on mobile devices. This creates a great opportunity to let people order products, services and software over the internet at any time, and it gives companies that stay ahead of the curve room for advancement and innovation. Given this shift, Marc sees every company becoming a software company. He foresees that over the next 10 to 20 years the healthcare and education industries will change the most, as they are vital to people, and that this change will require new software. In closing, like Ranadivé, Marc stressed teamwork: only when people work together, in an environment that encourages risk, will companies see exceptional progress.

After a few more speakers, they played a pre-recorded interview that Ranadivé did with PepsiCo CEO Indra Nooyi, in keeping with the theme of people and technology and stressing the importance of change. Named CEO in 2006 and included five years in a row on Fortune magazine's list of the 50 Most Powerful Women in Business, Nooyi believes that you should put your whole self into the job, which helps businesses change and evolve. Technology is making the world smaller, and growth beyond Western Europe and the United States is requiring companies to transform and increase their transparency, even at some cost to privacy, to engage with their customers. Sometimes a company must tell its story repeatedly to win over prospects. Companies need to create return and value for their customers to succeed. It is vital for a company not only to improve but to break away from the status quo, and the leader's role is to envision the future and make that vision attainable.

Overall, a great line-up of very inspiring speakers and a great way to kick off the event. We look forward to posting more insights from our experiences on days 2 and 3.

 

 

Conference Recap: PIDX Fall 2014

Standards as Value Creators in Oil and Gas

I attended the PIDX International US Fall Conference last week in Houston, where the theme was "How Standards Accelerate the Realization of Value in Oil & Gas." A variety of speakers addressed this topic with real-world examples, including Entrance's President Nate Richards. I want to highlight some of the key messages I saw coming out of this gathering of industry experts.


 

Be Proactive, Not Reactive

In a joint session with ADP, Noble Energy presented the ways they use standards to drive global efficiency in their e-invoice handling. The message that stood out most to me was that, thanks to e-invoicing, Noble is being proactive, not reactive. In other words, by leveraging PIDX standards, Noble is able to address issues before they become problems rather than constantly chasing down errors and answers to questions. It lets Noble ask the questions instead of being on the receiving end of them. The ability to be proactive drives efficiency by reducing time spent on follow-up and re-work.

The Time is Now

A common theme that presented itself across many presentations was the idea that “the time is now” for e-business automation.  The standards are well-established and the technology is robust.  There is a sense that critical mass is being reached as leaner businesses squeeze out inefficiencies in their back offices.  Businesses that are not automating are quickly falling behind the curve as time marches on.  Ultimately, businesses that ignore the momentum toward the automation of trading partner interactions will put themselves at a competitive disadvantage as electronic business transactions become “table stakes” for businesses across the entire oil and gas industry.

Common Business Challenges

As part of a panel discussion, Deloitte elaborated on the value of standards in the oil and gas industry.  Deloitte’s presentation made an important point about the business challenges facing the oil and gas industry.  Specifically, the Deloitte representative highlighted the following challenges:

  • Mergers and Acquisitions
  • Rapid Growth
  • Lack of Integration
  • Technology

The speed of business is constantly accelerating, and nowhere is that more true right now than in oil and gas. Data exchange standards provide a common language with which to drive process efficiency, which ultimately facilitates M&A and enables rapid growth. Lack of integration is a historical challenge, but it is a clearable hurdle now for companies willing to invest in their future efficiency. Technology is constantly changing, and larger organizations may struggle with their internal IT groups to get the right tools to meet urgent business needs.

Not Just the Standard Schema

On the technical side, it all comes down to actually making everything work at the end of the day. While standards like PIDX provide a standard schema to facilitate communication between businesses, there is still a significant challenge to be overcome around semantics. The PIDX standard provides a framework, but ultimately each implementer uses the standard in the way that makes sense to it. There is still much to be done around defining the meaning of terms. For example, organizations and individuals consistently disagree over the definition of a "well," which is such a fundamental aspect of any oil and gas data discussion. (PPDM has done a lot of work on this particular example, but challenges remain across the lexicon of oil and gas.)
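A tiny, entirely hypothetical Python sketch makes the distinction concrete: both payloads below satisfy the same structural check, yet "well_count" means something different to each sender:

    # Two trading partners, one (hypothetical) schema. Both messages are
    # structurally valid, but one counts wellbores and the other counts
    # surface locations -- the semantic gap a schema alone cannot close.

    operator_a = {"lease": "L-100", "well_count": 7}  # counts wellbores
    operator_b = {"lease": "L-100", "well_count": 4}  # counts surface wells

    def schema_valid(msg: dict) -> bool:
        """Structural check only: right fields, right types."""
        return (isinstance(msg.get("lease"), str)
                and isinstance(msg.get("well_count"), int))

    assert schema_valid(operator_a) and schema_valid(operator_b)
    # Aggregating the two without first agreeing on what a "well" is
    # yields a number that is schema-clean but semantically meaningless.
    print(operator_a["well_count"] + operator_b["well_count"])  # 11 of... what?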

What’s next?

For businesses in the oil and gas industry looking to drive efficiency in the back office, e-business automation is a proven tool. If you are interested in learning more about how to reduce manual processes, eliminate paper, decrease days sales outstanding (DSO) and drive down costs, then it's time to talk to Entrance consultants about creating a vision and road map for your company's software strategy, one that leverages standards and the latest technology to enable an efficient back office.

 

There Will Be Bugs! | Application Development Blog Series

Best practices for minimizing the impact and mitigating costs associated with fixing software bugs

First of all, let’s clear something up: what is a bug? A bug is a flaw in software that results in the application not functioning as intended, and is something that is theoretically preventable.

There will be bugs in all of the software applications that you use in your business, whether you custom develop them or buy them “off the shelf.”  Just let that sink in for a moment.

Changes, enhancements and new features will appear periodically as the needs of the business change, but they are not bugs. A change (or enhancement) is the result of a new requirement that was not known up-front. Frequently, owners of custom applications feel they have a "bug" because certain business rules are not applied in the desired manner; however, these are often changes or undocumented requirements (i.e., not bugs).

Bugs are an inevitable aspect of any software development project, as all software is created by imperfect humans. There are a variety of techniques that development firms like Entrance use in their application development practice to detect and eliminate bugs before they reach the production environment, which helps minimize and mitigate the impact on both end users and the project timeline:

  1. Well-written user stories with acceptance criteria – The most important step in preventing bugs is understanding what the application should do before any code is written. At Entrance, we use a "user story" format to capture detailed requirements as part of our agile project methodology. It is also crucial to capture acceptance criteria so that there is no ambiguity about the desired outcome of a feature.

  2. Automated testing – Automated testing is one piece of software testing another piece of software. It increases the efficiency of the quality assurance process while mitigating the likelihood of regression bugs. In the same way that business software can make your office more efficient, automated testing allows basic component testing to be performed frequently and very quickly. (A minimal sketch follows this list.)

  3. Quality assurance review – The most fundamental aspect of assuring quality is testing of the software by a specifically trained individual who is dedicated to finding bugs before your users do.

  4. User acceptance testing (UAT) – The final checkpoint for quality assurance is testing by actual users. Entrance helps clients through the UAT process by creating the user acceptance criteria and facilitating the execution of the user testing.
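As promised in item 2 above, here is a minimal sketch of software testing software. The business rule and function names are hypothetical, and the tests follow the pytest convention of plain test_ functions:

    # A hypothetical business rule under test: a $25 flat late fee applies
    # once an invoice is more than 30 days overdue.

    def late_fee(days_overdue: int) -> float:
        return 25.0 if days_overdue > 30 else 0.0

    # Each test pins down one expectation from the acceptance criteria.
    def test_no_fee_within_grace_period():
        assert late_fee(30) == 0.0

    def test_fee_applies_after_grace_period():
        assert late_fee(31) == 25.0

    # Running `pytest` re-executes these checks in seconds every time the
    # code changes -- which is how regressions get caught before QA or UAT.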

Entrance uses industry-standard tools such as Microsoft’s Team Foundation Server (TFS) to track and manage bugs found in applications that we develop. By tracking bugs in a detailed manner, we can calculate quality metrics and monitor our portfolio of application development projects.  Quality metrics allow us to identify trends, norms, and abnormalities so that we are able to keep all of our projects on track.

The cost of remediating bugs is addressed differently between off-the-shelf and custom applications. In the Commercial Off-the-Shelf (COTS) environment, you either get "no warranty" (as-is) or you pay an annual "maintenance fee" (usually in the neighborhood of 20% of the software's cost). If you're paying a "maintenance fee," then you're pre-paying for bug fixes (think of it as bug insurance). In the custom development world, as the application owner you pay for bugs, but the cost is typically recognized incrementally as bugs appear over time.

There are different ways to manage the cost of remediating bugs that make it through to production.

  1. Warranty – Warranties are not common in the software world due to the complex nature of business applications. Custom software may be warrantied in certain situations, most commonly in a fixed-fee project where the vendor agrees to implement specific functionality for a firm price. A warranty might also be offered as an option for a separate fee. If a warranty is offered, expect the vendor to be very particular about what is or is not covered by the agreement.

  2. Time and materials – In a time and materials scenario, the software owner engages the vendor after a defect is identified and is subject to the availability of the vendor at the time service is requested. This option exposes the software owner to the most risk and is generally only advisable for software that is plainly not mission critical.

  3. Retainer – Retainers tend to offer the best balance of cost and risk mitigation for most software owners. A retainer relationship guarantees the availability of the vendor up to a certain number of hours (typically per month) for a fixed fee, and may provide a discount over base pricing when pre-committing to a sizeable volume. Additional hours beyond those included in the retainer fee are typically offered on a time and materials basis, subject to the availability of the vendor. The main advantage of a retainer is the assurance that the vendor will be available to address any business-critical issues that arise in a given month. Depending on the structure of the retainer, hours not used for bug fixes or support may be available for other efforts such as enhancements. Prepaid hours do not roll over to the next month because the vendor has already committed resources in advance.

 

How custom software ages | Application Development Video Series, Episode 1

Not unlike hardware, custom-built software applications can age over time. There are a variety of ways that a business can outgrow a software application, and those aging applications can become more burdensome to maintain than to update or rewrite.

We don't typically think of software as something that can age. The code is written and run on a computer, so it doesn't age the way hardware or people do. Instead, it becomes obsolete or less functional, unable to keep up with the demands of the business and its users. Software is usually written against a snapshot of a business need at a certain point, or for a certain range, of time. Unfortunately, software doesn't change dynamically the way the business does. Businesses change, market demands change, the number of users changes, and the software can have trouble keeping up.

Another aspect of aging is that the software may have been built on a technology that is no longer supported or is slowly on its way out the door, deprecated or replaced by more modern design patterns and technologies. The software was written years ago, and the technical resources who know it are no longer available or are difficult to find. When you can find them, they are expensive, which makes maintaining the software more and more costly.

Technologies, design patterns, and our understanding of software as a functional piece of a business were limited 10-15 years ago, and that technology continues to evolve. Legacy applications were monolithic in nature and written top-to-bottom; every line of code was executed in one lump. To change one little thing in those applications, you had to change everything. Thankfully, we now have better paradigms alongside better technologies, so we can separate the different pieces of functionality and objectives into multiple layers (a minimal code sketch of this layering follows the list below):

  • We can have part of the application that is written specifically to manage the database.
  • We can have another piece that manages business rules and validation.
  • We can have another piece that’s a service layer that allows you to integrate other systems and software, preserving the code that’s already in place for business logic and the database.
  • We also have the user interface, the front end of the application. This part is also changing: it used to be just PC-based, but now you need to think about tablets, cell phones, and other GPS-enabled devices so people can access your software anywhere in the world.
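Here is the minimal sketch of that layered separation; every class name and rule below is a hypothetical stand-in, hard-coded so the example is self-contained:

    # --- Data layer: the only code that talks to the database ---
    class WellRepository:
        def find_by_id(self, well_id: int) -> dict:
            # A real implementation would query the database here.
            return {"id": well_id, "name": "Well A-1", "status": "producing"}

    # --- Business layer: rules and validation, no database or UI code ---
    class WellService:
        def __init__(self, repo: WellRepository):
            self.repo = repo

        def can_schedule_workover(self, well_id: int) -> bool:
            well = self.repo.find_by_id(well_id)
            return well["status"] == "producing"  # illustrative rule only

    # --- Service layer: exposes business logic to other systems and UIs ---
    def handle_workover_request(well_id: int) -> dict:
        service = WellService(WellRepository())
        return {"well_id": well_id,
                "approved": service.can_schedule_workover(well_id)}

    # A PC, tablet, or phone front end would all call the same service layer:
    print(handle_workover_request(42))

Because each layer only knows about the one beneath it, the database can be swapped, the rules revised, or a mobile front end added without rewriting everything, which is precisely what the old top-to-bottom monoliths could not do.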

We begin to realize there is an aging process that happens with software: as it ages, it becomes more difficult and expensive to maintain, and it costs you opportunities for growth. For instance, older software wasn't designed to take advantage of the hardware you're probably already using, with its multiple processor cores and robust memory. Bringing the software up to date gives you the opportunity to exploit those hardware options for better performance.

Software Modernization: When Is It Time?

Increase The Scalability And Efficiency Of Your Oilfield Services With Field Data Capture Software

Don't let your current time-keeping and work-order processes restrict your growth potential. Streamline and automate these critical tasks with field data capture software.

With the US onshore rig count holding at ~1,710, 2014 is shaping up to be another banner year for North American oilfield services companies. Are you capturing your share of this business?

Most oilfield professionals we engage with are coming to the realization that they’re leaving money on the table.  Their growth is being stunted or unrealized due to antiquated processes and systems.   The data capture tools and processes currently in the field are simply no longer adequate to meet their future growth plans and the elevated expectations of their clients.

In this blog, I'll address how using paper-based forms for time-keeping and work-orders in the field, while familiar and convenient, slows down time-sensitive workflows, hinders reporting, and creates data accuracy and integrity issues that negatively impact the payroll and invoicing work streams. I will also address how a Field Data Capture solution solves these challenges.

Poor data quality – the scourge of Finance and AR

Do you rely on hours-worked documentation from field operations staff to drive invoicing and payroll? Our services industry clients do, so we know the pain of managing invoicing and payroll when that data is missing, incomplete, inaccurate, or untimely.  The impact of poor data quality for hours-worked ripples throughout the organization from the CFO’s concern for Days Sales Outstanding (DSO) to the Accounts Receivable and Payroll clerks that are striving to do their jobs accurately and efficiently.  Electronic field data capture can add scalability and efficiency in the back office AR and Payroll departments.

Five critical questions for AR and payroll teams

AR and Payroll staff must always be on the ball due to the cyclical rhythm of the processes they own in the business. Whether your processes execute as frequently as weekly or as infrequently as monthly, the obligations of these departments do not change. For a services firm, AR and Payroll are arguably two of the most critical back office functions. These questions will help you assess the effectiveness of your own processes:

  • Does your data capture process differentiate between hours billed and hours paid? To ensure accuracy in back office activities, it is critical that your system of record clearly distinguish between hours to be invoiced and hours to be paid (especially for hourly employees). A minimal sketch of such a record follows this list.

  • How long does it take for your AR/payroll departments to get notified once work is performed? Is it weeks, days, hours, minutes or seconds? Yes, "seconds" is a possible answer, if you have a mobile-enabled electronic field data capture system in place.

  • How long does your back office take to process invoices or payroll once it has received time records? How much time does your staff spend performing data entry, rekeying into multiple systems or manually creating Excel files? Automating the static, repetitive invoice generation and payroll processing functions can make your back office staff much more efficient so they get more done in less time. In our experience, automation solutions never replace jobs; rather, they let the existing staff be more effective, reducing the need to hire additional staff as the business scales up. An often overlooked benefit is that automation also keeps the staff you do have much happier in their jobs, free to focus on the work that cannot be automated away.

  • How often do you receive complaints about invoice or payroll inaccuracies? Manual processes are naturally error-prone, even for the most diligent clerk. Automating them means you set the business rules once, up-front, and the process and math are applied the same way every cycle.

  • Is there one critical person in your business process who controls everything? If your AR or payroll clerk goes on vacation, gets sick or retires, can your business processes execute without interruption? Process automation reduces risk to the organization by allowing the business to continue executing as usual even when personnel change.
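As referenced in the first question above, here is a minimal sketch, with hypothetical names and rates, of a ticket record that keeps hours paid and hours billed as separate facts so neither work stream corrupts the other:

    from dataclasses import dataclass

    @dataclass
    class FieldTicket:
        employee: str
        hours_worked: float    # drives payroll
        billable_hours: float  # drives the client invoice
        pay_rate: float        # $/hour paid to the employee
        bill_rate: float       # $/hour billed to the client

        def payroll_amount(self) -> float:
            return self.hours_worked * self.pay_rate

        def invoice_amount(self) -> float:
            return self.billable_hours * self.bill_rate

    ticket = FieldTicket("J. Smith", hours_worked=10.0, billable_hours=8.5,
                         pay_rate=35.0, bill_rate=95.0)
    print(ticket.payroll_amount())  # 350.0 -- paid for all hours worked
    print(ticket.invoice_amount())  # 807.5 -- billed only for billable hours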

 

Attend our free seminar to learn more!

Hopefully, I've convinced you that arming your field workers with new data capture software and mobile devices will dramatically improve information flow between the field and your back office, resulting in more efficient and scalable processes that enable, rather than hinder, your workforce as it supports more clients and work. While I focused specifically on time keeping and work orders, a robust Field Data Capture solution provides similar benefits for inspection reports and asset integrity management.

If you are interested in learning more about how to jump start your own Field Data Capture improvement initiative, and are already starting to consider whether a custom and/or packaged data capture solution is the right approach, I highly recommend that you attend our FREE lunch seminar at III Forks on July 17th, where Nate Richards will provide an overview of Field Data Capture solutions for oilfield services.

LEARN MORE

GETTING STARTED VIDEO:  ELECTRONIC FIELD TICKETING

CASE STUDY: NA PIPELINE SERVICES COMPANY INCREASES OPERATIONAL VISIBILITY

SOLUTION PAGE: ELECTRONIC FIELD TICKETING CONNECTS FIELD TO CORNER OFFICE

3 Reasons Why Data Management Should Be Strategic Rather Than Tactical


During the 18th International Conference on Petroleum Data Integration, Information and Data Management (commonly known as the "PNEC" conference) on May 20-22, 2014, a session on the first day was dedicated to the topic of professional data management. During the panel discussion, an attendee asked the following question: why do we even need a professional role dedicated to managing data, since data service is a supportive role to various operations?

Trudy Curtis, CEO of the PPDM Association, answered by emphasizing that data management (especially data content management) should not be viewed as a tactical function but as a strategic one, needing focus and planning to help businesses truly benefit from the potential of strong, quality data.

Many businesses indeed do not view data management as a strategic function. Below, I will give three reasons why data management deserves to be a key strategic function within any modern digital business.

Data is the lifeblood of the modern business workflow

When considering the business processes, or workflows, of a business, many would say they comprise mainly operational procedures. For many of these workflows, the role of data is supportive (i.e., data is the input and artifact of the workflow, but it does not alter how the workflow is run). Enter the digital age, though, and you suddenly have important workflows that cannot run without putting data management into a much more strategic, proactive role.

Suppose that an upstream company manages leases where the division of interest changes once the land's wells reach a certain level of production. For example, when the company produces X barrels of crude oil, a particular division of interest doubles (while the interests of the other entities shrink proportionally); a small sketch of this rule appears after the list below. The data about the leases and the production accounting data are stored in separate places. In this case, two challenges arise:

  • If the production level data is not accurate (a data quality issue), it may trigger the change of division of interest at the wrong level, or fail to trigger at the right level. This will bring losses to the company, and/or damage relationships with customers.
  • If the production level data is not accessible to the lease management department (a data accessibility issue), then the whole workflow relies entirely on someone in the accounting department notifying the land department to make the necessary change. Not only is this cumbersome, but the probability of missing the change notice is very high.
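Here is the sketch promised above: a minimal, hypothetical encoding of the trigger, which shows how directly the outcome depends on the accuracy of the production figure feeding it:

    # Hypothetical threshold standing in for the "X barrels" trigger.
    THRESHOLD_BBL = 100_000

    def adjust_interests(interests, cumulative_bbl):
        """Double OperatorCo's share once production crosses the threshold,
        shrinking the other entities proportionally so shares still sum to 1."""
        if cumulative_bbl < THRESHOLD_BBL:
            return interests
        doubled = min(interests["OperatorCo"] * 2, 1.0)
        remainder = 1.0 - doubled
        others = sum(v for k, v in interests.items() if k != "OperatorCo")
        return {k: doubled if k == "OperatorCo" else v / others * remainder
                for k, v in interests.items()}

    before = {"OperatorCo": 0.20, "PartnerA": 0.50, "PartnerB": 0.30}
    print(adjust_interests(before, cumulative_bbl=120_000))
    # {'OperatorCo': 0.4, 'PartnerA': 0.375, 'PartnerB': 0.225}
    # An inaccurate cumulative_bbl triggers this change at the wrong time;
    # inaccessible production data means nobody runs it at all.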

As you can see, today's workflows increasingly depend on data quality, accessibility, governance and the like to ensure the quality of process execution. To minimize the negative impact of data issues, data management needs to happen at a strategic level, so that it can plan ahead and ensure that every process in the company is well supported by the data it needs. If there is no plan, the data will not be there when you need it.

Unplanned data cannot give meaningful information

One wave the industry is catching is Business Intelligence (BI). By utilizing data integration, dashboards, data warehouses and the like, BI provides a powerful platform for generating useful information and helping the business make better decisions. There is, though, not enough discussion of the underlying requirement: data quality.

Simply put, the data needs to be of a certain quality to support BI objectives. One common challenge is that the data required for a useful rollup of a dataset often has not been captured. BI projects rely on well-captured data to be successful and useful; if the data has not been captured, a BI project will not miraculously fix that.

As the saying goes, "garbage in, garbage out." BI projects also rely on good-quality data, accurate and precise enough for correct rollups, to provide adequate and realistic information. In fact, the most costly portion of many BI projects is the data cleansing required to make them successful.
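As a small illustration (hypothetical column names and records, using the pandas library), the quality gate in front of a rollup might look like this:

    import pandas as pd

    production = pd.DataFrame({
        "well_id": ["A-1", "A-2", "A-3", None],        # missing key
        "field":   ["Eagle", "Eagle", None, "Delta"],  # missing rollup dimension
        "oil_bbl": [1200.0, 950.0, -15.0, 800.0],      # negative = bad reading
    })

    # Garbage in, garbage out: flag rows that would silently corrupt a rollup.
    bad = production[
        production["well_id"].isna()
        | production["field"].isna()
        | (production["oil_bbl"] < 0)
    ]
    print(f"{len(bad)} of {len(production)} rows fail the quality checks")

    # Only clean rows should feed the field-level rollup.
    clean = production.drop(bad.index)
    print(clean.groupby("field")["oil_bbl"].sum())  # Eagle  2150.0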

If the data has already been managed strategically, ensuring certain quality, governance and availability, projects and operations that rely on this data will be much more cost efficient and successful.

Data maturity needs to grow with the business itself

Many people talk about data growth in terms of volume. Data volume is certainly a key factor, but it would be unwise to overlook the fact that data maturity needs to grow with the business itself as well. It will not magically catch up with the business, and ignoring it in the business roadmap can lead to negative impacts.

Realistically speaking, setting up a mature data model and strategy is costly and time-consuming. Small businesses need quick wins to maintain positive cash flow; most cannot afford high data maturity, so "getting the job done" is what they focus on.

As the organization grows, though, the data has to mature with it. Business requirements will expand, or change altogether, as the organization grows, and the original data model, quality and governance will not be able to support the growing operations.

Projects to improve data quality, set up governance, ensure accessibility and so on are expensive and time-consuming, so they need to be planned ahead of time, in accordance with the organization's roadmap.

Moving forward

Just like IT, data management used to be viewed as a supportive, tactical function. In this new digital age, however, data management deserves better. Align data management with your company's strategic roadmap, and your organization will have a head start on quality data, ensuring operational efficiency and cost savings in the long run.

The Importance of Aligning Business Units and IT Departments

During the PNEC conference on May 20-22, 2014, multiple presentations showcased various levels of success in IT and data management projects. One key theme that kept appearing was "aligning business units with IT departments" for a joint implementation effort. For most of the presentations, however, there was no further explanation of how to make this happen.

This is an epidemic across multiple industries, but it is particularly severe in the energy space. In many organizations, IT departments work in silos, and business units do not know how to manage them.

The traditional role of the IT department

Even though technology has evolved tremendously since its introduction into the corporation, the way IT departments are managed has not changed significantly. In many organizations, the main responsibility of the IT department is to support various operations, and the role of IT is usually tactical, not strategic.

These IT departments usually have a monolithic structure. By and large, the IT department is a distinct sub-organization within the corporation. With this structure, IT usually operates in a silo, out of sync with the organization's development. While IT needs to support multiple business units, it does not blend well with any one of them, since it has its own special purpose and required talents.

Because business units normally do not have a solid understanding of how IT works, the IT department is usually a mystery to the whole corporation. Operating as a black box (even more so if part, or all, of the IT function is outsourced), there is no measurement of how aligned the IT department is to the business, as long as the bare minimum of digital functions seems to work for the most part.

There are many articles on the internet about managing IT departments in new ways, for example decentralizing the monolithic IT department into smaller units spread across different cross-functional teams. That may be an enormous task for many big corporations. However, we can manage IT projects better by discerning their different purposes and functions.

Organize IT projects by purpose

There is definitely no single way to categorize IT projects. Here, I will put IT projects into four categories by purpose, for better clarity and ease of management:

Digital Fabric: the infrastructure

This is the backbone of every IT project and all digital capabilities. These projects include everything from hardware to networking to server software to technology toolkits. Their main goal is to provide the organization with the best capability for developing its digital assets. The talent championing this area is not necessarily good at facing end users; rather, these people need the deepest technical excellence to ensure a robust system that other development teams can easily build upon, with the best availability (uptime).

Common Services: cost- and operation-efficiency focus

IT departments have an inherent function to support end users in various ways. Many of these projects are quite commonplace; they are about ensuring operational efficiency, reducing cost, and giving users (internal and external) non-invasive ways to carry on their day-to-day operations. Implementing file-sharing services and shopping carts are two examples. Talent in this area excels at focusing on what users want, user interface design, and serviceability.

Enterprise Solutions: extension of business capabilities

The truly interesting IT projects are usually not those focusing on cost reduction; they belong to this category instead. The end goal of these enterprise projects is to extend and expand the company's capability, in order to generate revenue and gain competitive advantage. Enterprise software projects, custom application development, master data management and the like all fall under this category. These are also the projects that get deeply embedded into business processes; once implemented, the solution greatly enhances the process's capability, achieving goals far more efficiently than human effort alone. The cost, however, is that solutions so embedded into processes are not easily swappable, and project durations tend to be longer. That is why these projects need to be championed by people, such as business analysts, who combine technical expertise with knowledge of the specific business operations. Enterprise solution projects require the most strategic attention in planning and resource management.

Analytics: cutting-edge insights and research

These projects help the business make better decisions; information and insights are the keys. Business Intelligence (BI) is one example. These projects are usually quite cutting-edge, experimenting with and discovering new ways of looking at data and information to help the business gain new insights, make better decisions, and discover new revenue. People who excel in this category are fearless and enthusiastic innovators.

 

How to align business units and IT departments

As the above shows, even though traditional IT departments are monolithic organizations, they actually handle various types of projects that have different goals and require different kinds of talent. One of the biggest hurdles for business units dealing with IT projects is the wrong project-talent match: a project may be about cost reduction, and IT professionals who are more enthusiastic about innovation may find it boring. Another hurdle is that business units typically manage custom solution projects as if they were common services, not realizing the timeline, resources and strategic attention the project requires.

Entrance has extensive experience in custom solution development and BI projects. We have a proven track record of implementing solutions that extend clients' business capabilities to unprecedented levels, unlocking competitive advantage and preparing for the organization's next wave of growth. We understand how to tie solutions to business processes for optimal efficiency, as well as how to help business owners make the best decisions with our BI expertise.