There Will Be Bugs! | Application Development Blog Series

Best practices for minimizing the impact and mitigating costs associated with fixing software bugs

First of all, let’s clear something up: what is a bug? A bug is a flaw in software that results in the application not functioning as intended, and is something that is theoretically preventable.

There will be bugs in all of the software applications that you use in your business, whether you custom develop them or buy them “off the shelf.”  Just let that sink in for a moment.

Changes, enhancements, or new features will appear periodically due to changes in the needs of the business, but they are not bugs.  A change (or enhancement) is the result of a new requirement that was not known up front.  Frequently, owners of custom applications will feel like they have a “bug” because certain business rules are not applied in the desired manner; however, these are often changes or undocumented requirements (i.e., not bugs).

Bugs are an inevitable aspect of any software development project, as all software is created by imperfect humans.  There are a variety of techniques that development firms like Entrance use in their application development practice to detect and eliminate bugs before they reach the production environment, which helps minimize and mitigate the impact on both end users and the project timeline:

  1. Well-written user stories with acceptance criteria – The most important step in preventing bugs is understanding what the application should do before any code is written.  At Entrance, we use a “user story” format to capture detailed requirements as part of our agile project methodology.  Additionally, it is crucial to capture acceptance criteria so that there is no ambiguity about the desired outcome of the feature.
     
  2. Automated testing – Automated testing is one piece of software testing another piece of software.  It increases the efficiency of the quality assurance process while mitigating the likelihood of regression bugs.  In the same way that business software can make your office more efficient, automated testing allows basic component testing to be performed frequently and very quickly (a minimal sketch appears after this list).
     
  3. Quality assurance review – The most fundamental aspect of assuring quality is having the software tested by a specifically trained individual who is dedicated to finding bugs before your users do.
     
  4. User acceptance testing (UAT) – The final checkpoint for quality assurance is testing by actual users.  Entrance guides our clients through the UAT process by creating the user acceptance criteria and facilitating the execution of the user testing process.
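
To make item 2 concrete, here is a minimal sketch of what an automated component test can look like, using Python’s built-in unittest module. The working_interest_share function and all of its figures are hypothetical, invented purely for illustration; the point is that a check like this can run automatically on every build and catch regressions before they reach production.

```python
import unittest

def working_interest_share(gross_revenue, royalty_rate, working_interest):
    """Revenue attributable to a working-interest owner after royalties.

    Hypothetical business rule, used here only to illustrate automated testing.
    """
    if not 0 <= royalty_rate < 1:
        raise ValueError("royalty rate must be between 0 and 1")
    return gross_revenue * (1 - royalty_rate) * working_interest

class WorkingInterestTests(unittest.TestCase):
    def test_typical_share(self):
        # $1,000 gross, 20% royalty, 50% working interest -> $400
        self.assertAlmostEqual(working_interest_share(1000, 0.20, 0.50), 400.0)

    def test_invalid_royalty_rate_rejected(self):
        with self.assertRaises(ValueError):
            working_interest_share(1000, 1.5, 0.50)

if __name__ == "__main__":
    unittest.main()
```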
     

Entrance uses industry-standard tools such as Microsoft’s Team Foundation Server (TFS) to track and manage bugs found in applications that we develop. By tracking bugs in a detailed manner, we can calculate quality metrics and monitor our portfolio of application development projects.  Quality metrics allow us to identify trends, norms, and abnormalities so that we are able to keep all of our projects on track.
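
As a rough illustration of the kind of quality metrics that detailed bug tracking makes possible, here is a small sketch in Python. The record fields and numbers are hypothetical (this is not TFS’s actual schema or export format); the idea is simply that individual bug records can be rolled up into trend numbers such as a defect escape rate.

```python
from collections import Counter

# Hypothetical bug records, of the kind a tracker like TFS lets you export.
bugs = [
    {"project": "portal",  "found_in": "QA"},
    {"project": "portal",  "found_in": "production"},
    {"project": "billing", "found_in": "QA"},
    {"project": "billing", "found_in": "QA"},
]

total = len(bugs)
escaped = sum(1 for b in bugs if b["found_in"] == "production")

# Defect escape rate: the share of bugs that reached end users.
print(f"Defect escape rate: {escaped / total:.0%}")

# Per-project counts help spot projects that deviate from the norm.
print(Counter(b["project"] for b in bugs))
```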

The cost of remediating bugs is addressed differently between off-the-shelf and custom applications.  In the Commercial Off-the-Shelf (COTS) environment, you either get “no warranty” (as-is) or you pay an annual “maintenance fee” (usually in the neighborhood of 20% of the software’s cost, so a $100,000 license would carry roughly $20,000 per year).  If you’re paying a “maintenance fee,” then you’re pre-paying for bug fixes (think of it as bug insurance).  In the custom development world, as the application owner you pay for bugs, but the cost is typically recognized incrementally as bugs appear over time.

There are different ways to manage the cost of remediating bugs that make it through to production.

  1. Warranty – Warranties are not common in the software world due to the complex nature of business applications.  Custom software may be warrantied in certain situations, most commonly in a fixed-fee project where the vendor agrees to implement specific functionality for a firm price.  A warranty might also be offered for a separate fee as an option.  If a warranty is offered, expect the vendor to be very particular about what is or is not covered by the agreement.
     
  2. Time and materials – In a time and materials scenario, the software owner will engage the vendor after a defect is identified and will be subject to the availability of the vendor at the time service is requested.  This option exposes the software owner to the most risk and is generally only advisable for software that is plainly not mission critical.
     
  3. Retainer – Retainers tend to offer the best balance of cost and risk mitigation for most software owners.  A retainer relationship with a vendor guarantees the availability of the vendor up to a certain number of hours (typically per month) for a fixed fee and may provide a discount over base pricing when pre-committing to a sizeable volume.  Additional hours beyond the monthly hours included in the retainer fee are typically offered on a time and materials basis, subject to the availability of the vendor.  The main advantage of a retainer is that you can be assured that the vendor will be available to address any business-critical issues that may arise in a given month.  Depending on the structure of the retainer, hours not used for bug fixes or support may be available for other efforts such as enhancements.  Prepaid hours typically do not roll over to the next month because the vendor has already committed resources in advance.
     

 

How custom software ages | Application Development Video Series, Episode 1

Not unlike hardware, custom-built software applications can age over time. There are a variety of ways that a business can outgrow a software application, and those aging applications can become more burdensome to maintain than to update or rewrite.

We don’t typically think of software as something that can age. The code is written and run on a computer, so it doesn’t age the way hardware does or people do. Instead, it becomes obsolete or less functional, unable to keep up with the demands of the business and the business users. Software is usually written based on a snapshot of a business need for a certain period or range of time. Unfortunately, software doesn’t change dynamically the way the business does. Businesses can change, market demands can change, the number of users can change, and the software can have trouble keeping up with all of that.

Another aspect of aging is that the software may have been built in a technology that is no longer supported or is slowly on its way out the door; it’s being deprecated or replaced by more modern design patterns and technologies. The software was written years ago, and the technical resources are no longer available or are difficult to find. When you can find them, they are expensive, which makes maintaining the software more and more costly.

Technologies, design patterns, and the understanding of software as a functional piece of a business were limited 10-15 years ago, and that technology continues to evolve. When we think about legacy applications, they were monolithic in nature and written top-to-bottom; every line of code was executed in one lump sum. To change one little thing in those applications, you had to change everything.  Thankfully, we now have better paradigms alongside better technologies where we can separate the different pieces of functionality and objectives into multiple layers (a brief code sketch follows the list below).

  • We can have part of the application that is written specifically to manage the database.
  • We can have another piece that manages business rules and validation.
  • We can have another piece that’s a service layer that allows you to integrate other systems and software, preserving the code that’s already in place for business logic and the database.
  • We also have the user interface, the front end of the application. This part is also changing: it used to be just PC-based, but now you’re going to want to think about GPS-enabled devices, tablets, and cell phones so people can access your software anywhere in the world.
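
Here is a minimal sketch of that layered separation, in Python for brevity. The names (WellRepository, validate_well, WellService) are invented for illustration rather than drawn from any particular system; the point is that the data layer, the business rules, and the service layer can each change independently.

```python
# Data layer: the only code that knows how records are stored.
class WellRepository:
    def __init__(self):
        self._wells = {}          # stand-in for a real database

    def save(self, well_id, data):
        self._wells[well_id] = data

    def get(self, well_id):
        return self._wells[well_id]

# Business layer: rules and validation, with no storage details.
def validate_well(data):
    if data.get("depth_ft", 0) <= 0:
        raise ValueError("depth must be positive")

# Service layer: the entry point other systems (or a UI) integrate with.
class WellService:
    def __init__(self, repo):
        self.repo = repo

    def register_well(self, well_id, data):
        validate_well(data)       # business rules stay in one place
        self.repo.save(well_id, data)

service = WellService(WellRepository())
service.register_well("W-001", {"depth_ft": 9200})
print(service.repo.get("W-001"))
```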

We begin to realize there is an aging process that happens with software. As it ages, it becomes more difficult and expensive to maintain, and it also costs you opportunities for growth. For instance, older software wasn’t designed to take advantage of the hardware that you’re probably already using, which has multi-core processors and robust memory capabilities. Bringing the software up to date gives you the opportunity to take advantage of those hardware options for better performance.

For more, read our post “Software Modernization: When Is It Time?”

Field Data Capture and Disaster Recovery

In January, Entrance hosted a lunch and learn, “Field Data Capture for Oil and Gas Service Companies.” Entrance’s president, Nate, covered some of the key considerations that companies out in the field should address before implementation.

During the Q&A portion, one of the big topics of conversation centered around Toughbooks. Field work is characteristically rough, so choosing and maintaining the right devices can be difficult.

Generally, a basic laptop is not sufficient for handling fieldwork. In this situation, many organizations weigh the costs and benefits of a tablet versus a Toughbook.

Both can be a good fit, but considerations like cost, computing power, and size can all make a big difference in the final evaluation. In particular, organizations should be careful not to let the popular appeal of devices like the iPad sway their overall evaluation of available devices.

Disaster Recovery

Regardless of the device your company decides to go with, a disaster recovery plan is important for maintaining productivity in the field. No matter how well-made your Toughbook, devices go down and get lost. Particularly in the case of a tablet, damage is a consideration because tablets are easily breakable.

Without a recovery plan, your people can end up without a device for several days or more. The cost in lost productivity alone is enough to make this an unacceptable outcome.

The solution? Simple as it may sound, have backups on hand that can quickly be deployed to your team. Be sure to address how to accomplish this even when workers are in very remote areas.

Choosing the right devices and having a plan for managing them is important for companies across the value chain, from upstream all the way to service. For more, watch the field data capture presentation from SlideShare. You can also find out about upcoming lunch and learns here.

Data Management for Oil & Gas: High Performance Computing

Data Management and Technology

The oil and gas industry is dealing with data management on a scale never seen before. One approach to quickly get at relevant data is with High Performance Computing (HPC).

HPC is dedicated to the analysis and display of very large amounts of data that need to be processed rapidly for best use.

One application is the analysis of technical plays with complex folding. In order to understand the subsurface, three-dimensional, high-definition images are required.

The effective use of HPC in unconventional oil and gas extraction is helping drive the frenetic pace of investment, growth and development that will provide international fuel reserves for the next 50 years. Oil and gas software supported by data intelligence drives productive unconventional operations.

Evolving Data Management Needs

As far back as 2008, the Microsoft High-Performance Computing Oil and Gas Industry Survey conducted by the Oil & Gas Journal Online Research Center indicated that many industry geoscientists and engineers have access to the computing performance levels they require.

However, computing needs are growing more complex, so significant room for improvement exists. Numerous respondents believe that making HPC available to more people industry-wide can increase production, enhance decision-making, reduce delays in drilling, and reduce the overall risk of oil and gas projects.

Chesapeake is the largest leasehold owner in the Marcellus Shale play, which reaches from southern New York to West Virginia. They employ HPC in their shale and tight sands operations.

3-D imaging enables technical staff to detect fine-scale fracturing and directional dependency characteristics. Seismic data provides a structural road map that helps identify dip changes, small faults and natural fracture orientation.

High Performance Computing in the Real World

Chesapeake routinely performs inversions of pre-stack and post-stack data. Datasets for imaging and inversion support models that represent complex earth structures and physical parameters, where true inversion results are known.

Reservoir maps require constant updating. Advanced pre-stack 3-D techniques are used to extract detailed rock properties that aid in discriminating good rock from bad rock in the Marcellus.

Focusing on pre-stack data has significantly increased computational requirements. Depending on the acquisition method, collecting multicomponent 3-D data can increase data size by orders of magnitude.

Advanced algorithms provide results in a matter of days, making it possible to realistically deal with a lease schedule.

Clustered supercomputing systems are becoming well priced and scalable. HPC options are not only realistic but also a requirement for independents who want to bring advanced processing capabilities in house.

For more, check out this blog post on how oil and gas companies are using data management to improve processes…

Keys to Business Intelligence

Five key insights from business intelligence expert David Loshin

In a recent interview, David Loshin, president of business intelligence consultancy Knowledge Integrity, Inc., named five key things organizations can do to promote business intelligence success:

  • Design configurable business intelligence dashboards that can provide needed metrics in real time
  • Provide drill-down capabilities for metrics that are of specific concern for the business
  • Ensure agreement about performance goals and targets throughout the organization
  • Create a cultural understanding of how metrics should be used
  • Experiment with different analyses to determine which ones can provide business value

Design configurable business intelligence dashboards that can provide needed metrics in real time

According to Loshin, the key goal of any business intelligence program should be to provide performance metrics in a way that is informative, but not intrusive. In other words, business intelligence dashboards need to be highly configurable in order to make sure that business users are getting access to the exact data they need, without falling victim to data paralysis caused by having to sift through all the data they don’t need.

In addition, business intelligence dashboards need to be able to provide updates in real time, in order to ensure that business users are making decisions based on the most current view of metrics.
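
As a loose illustration of what “configurable” can mean in practice, here is a sketch of a dashboard defined as data rather than code. The metric names and refresh interval are hypothetical and not tied to any particular BI product; the point is that each user’s panel set can change without rebuilding the application.

```python
# Each user sees only the metrics they asked for, at their own cadence.
dashboard_config = {
    "user": "ops_manager",
    "refresh_seconds": 60,     # a real dashboard would re-render on this cadence
    "panels": [
        {"metric": "daily_revenue", "drill_down": True},
        {"metric": "open_tickets",  "drill_down": False},
    ],
}

def render(config, fetch_metric):
    for panel in config["panels"]:
        value = fetch_metric(panel["metric"])
        suffix = " (drill-down available)" if panel["drill_down"] else ""
        print(f'{panel["metric"]}: {value}{suffix}')

# Stand-in data source; a real dashboard would query the BI store.
render(dashboard_config, fetch_metric=lambda name: {"daily_revenue": 18250,
                                                    "open_tickets": 7}[name])
```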

Provide drill-down capabilities for metrics that are of specific concern for the business

Every organization wants different insights from their business intelligence solutions. As a result, business intelligence dashboards should not be one-size-fits-all in the insights they provide.

If an organization knows in advance that a specific metric could be particularly helpful for their business, they should plan ahead to make sure their BI dashboard includes drill-down capabilities for that metric, so that they will be able to get a deeper level of insight when the need arises.

Ensure agreement about performance goals and targets throughout the organization

What are the most important insights that can be gained from a business intelligence solution? For some organizations, it’s figuring out the best way to generate new revenue. For others, it may be reducing costs or mitigating risks.

Either way, it’s important that all key stakeholders understand the values that matter most to the business, and know how BI metrics will be used to help meet those performance goals and targets.

Create a cultural understanding of how metrics should be used

An efficient business intelligence solution should allow individuals to take independent action, but there should also be an organization-wide understanding of how each individual is expected to use the insights provided by the BI solution.

C-level executives set the standard for what data is important to monitor, but they won’t be the ones actually drilling down into the data. As a result, it’s important that all business users have an understanding of how BI can help improve their decision-making.

Experiment with different analyses to determine which ones can provide business value

Business intelligence is most likely to be successful when it has executive support, but executives will probably only provide support for programs that have demonstrated value in the past. Loshin compares this situation to a chicken/egg problem: business users need executive support to implement quality BI solutions, but they often need to prove the value of business intelligence solutions before they can get executive support.

To overcome this problem, Loshin recommends undertaking a series of short experiments to find which BI analyses can provide business value, while weeding out the ones that can’t. It’s quite likely that many of the tested analyses won’t prove valuable, but the ones that do should provide sufficient return to make the experimentation worthwhile.

For more, read this post on the ROI for business intelligence

Lease Compliance and Disparate Data

Lease Compliance and Resolution of Data

Lease compliance issues in domestic shale have become critical to E&P asset protection. Lease jeopardy has grown into a major component of risk management of upstream resources.

It demands a well-defined, cohesive data management strategy that creates business intelligence from disparate sources.

Land Administration Information Systems Inadequate

Land Administration is a traditional role within an E&P enterprise.  Within its domain falls the safeguarding of all of a company’s land assets, among them the oil and gas leases.

Landmen are highly specialized members of an E&P exploration and production team whose duties include assisting in the analysis of deeds, leases and other contracts.  A landman’s responsibility typically includes ensuring that the company’s land assets database is maintained and that key attributes of lease provisions are recorded and updated.

Unfortunately, these talented and versatile members of the upstream resources team are increasingly handicapped in their lease administration duties. This is because information technology hasn’t kept pace with changing demands in the leasing arena.

The data they’re so meticulously maintaining sits in a static data repository. As a result, it goes unused by advanced oil and gas software with lease analysis and business intelligence capabilities.

Changing Aspects of Lease Compliance

Several factors are driving changes in shale lease compliance.  As competitors move rapidly in vying for a shrinking pool of unconventional land assets of increasingly significant value, lessors are reexamining the royalty structure of their existing leases.

This and falling natural gas prices have helped motivate lessors, newly focused on their monthly royalty payments, to analyze their lease provisions, often with professional legal advice.  This has become a highly litigious area, with more and more attorneys building a lucrative practice area in oil and gas lease compliance and its growing body of case law.

“Royalty and land and lease rights disputes were the most common types of unconventional oil and gas litigation during 2012,” reports Navigant.

“Oil and gas companies involved in unconventional exploration continue to face not only the challenge of differing state laws, but also a constantly evolving legal landscape as landmark cases make their way through state courts,” noted Navigant, referring to Butler v. Powers Estate, litigation based upon the provisions of a 19th-century oil and minerals lease.

If an E&P company isn’t performing due diligence on lease provision compliance, it is exposing itself to the potential of costly litigation and risking the devaluation or loss of key productive assets. Overnight what was regarded as a stable corporate asset can become a legal liability if lease oversight has failed.

A company lacking a consistently reliable system for proactive lease compliance is apt to be blindsided by an eager attorney who’s found a non-compliant lease under which he can get his fingernails.

Consolidating Disparate Lease Data into Business Intelligence

Most companies already have on hand the data needed to be proactive in their unconventional E&P lease compliance.  It often exists in various forms that convey different meanings in non-integrated data stacks. Examples include the property management database, the financials system, the reserves model, and the drilling information.

With the right analytical methodology, this disparate data can be consolidated into an information system that provides reports and alerts that help lessen lease jeopardy.

Building a Viable Lease Compliance System

A multi-part, iterative process is used to build and deploy a viable lease compliance application.

First, leases are identified and reviewed, with their critical provisions defined, e.g., key dates, key quantities.  Next, the different systems of record are identified, and the key attributes that define the leases’ critical provisions are mapped to the destination system, with inconsistencies corrected.

At this point the source data is cleansed — incorrect or inaccurate data is modified, deleted or replaced. The resulting “good” data is synchronized to the master data set, with links established to the various source systems.

Integration with the initial lease provision inventory now occurs, with processes put into place to ensure the lease provision inventory remains accurate and current.  Lastly, lease provision tracking is established, with sensitivity cases defined.

Some of the critical sensitivity issues are royalty escalation, cessation of production and continuous drilling. Business process owners are assigned to the system and critical, sensitivity-driven alerts established for key users.
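
Here is a highly simplified sketch of that tracking step, with hypothetical lease fields and thresholds (none of this is drawn from a real compliance system): each lease’s critical provisions live as structured data, and a scheduled check raises alerts for the assigned business process owner.

```python
from datetime import date

# Consolidated lease records, after mapping and cleansing (fields hypothetical).
leases = [
    {"id": "L-101", "expiration": date(2025, 9, 1),
     "last_production": date(2025, 5, 2), "owner": "jsmith"},
    {"id": "L-202", "expiration": date(2026, 3, 15),
     "last_production": date(2024, 11, 20), "owner": "adavis"},
]

def compliance_alerts(leases, today, cessation_days=180, expiry_window_days=120):
    """Flag leases nearing expiration or showing a long cessation of production."""
    alerts = []
    for lease in leases:
        if (lease["expiration"] - today).days <= expiry_window_days:
            alerts.append((lease["owner"], lease["id"], "expiration approaching"))
        if (today - lease["last_production"]).days >= cessation_days:
            alerts.append((lease["owner"], lease["id"], "cessation of production"))
    return alerts

for owner, lease_id, issue in compliance_alerts(leases, today=date(2025, 6, 1)):
    print(f"ALERT for {owner}: lease {lease_id} - {issue}")
```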

For more, read our post on leveraging unconventional information assets for Upstream…

Bad Oil and Gas Software: Key Concerns

Optimizing Oil and Gas Software

Oil and gas software is an essential component for businesses in the energy industry. It allows them to respond to problems more quickly, review historical data more easily and send reports to managers automatically.

Oil and gas software also enables the implementation of a formal process for tracking production, as opposed to the collection of spreadsheets that has traditionally been used in this industry.

However, poor oil and gas software can also create problems, which may be classified into the areas of assets, production and revenue.

Assets

Oil and gas software can cause a failure to pay out on a well interest. This can occur when the software calculates the royalty improperly, causing the balance to fall below the minimum pay requirement for that account.
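
To illustrate the mechanism, here is a sketch with hypothetical numbers: a decimal slip in the royalty rate understates the accrued balance, which then never crosses the minimum-pay threshold, so the owner is never paid.

```python
MIN_PAY = 100.00   # hypothetical minimum balance before a check is cut

def should_pay(accrued_balance):
    return accrued_balance >= MIN_PAY

volume, price, royalty_rate = 1_000, 3.50, 0.125   # hypothetical figures

correct = volume * price * royalty_rate            # 437.50 -> owner gets paid
buggy   = volume * price * 0.0125                  # decimal slip: 43.75 -> held

print(should_pay(correct), should_pay(buggy))      # True False
```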

Poor software can also cause a business to over bid or under bid on an asset by miscalculating the expected value of that asset. The primary factor that determines an asset’s value is the current price of crude oil or natural gas, which can fluctuate greatly over time.

Political factors can also have a significant effect on asset valuation, especially in areas characterized by civil unrest. Technological improvements can increase an asset’s value in the future by reducing the recovery costs.

Production

Software can also create problems with production, such as inaccurate production estimates and reductions in actual production. These problems include double-counting the output of one or more wells, causing your company’s total production to appear higher than it actually is.

Software can also cause your company to drill a hole in a poor location that costs more to operate than it produces. The incorrect allocation of production is another problem that can be caused by poor software.

Facilities with multiple wells must allocate production to each well, which is generally based on each well’s reported output. Software that reports production incorrectly can result in a sub-optimal allocation of resources.
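
Here is a sketch of that pro-rata idea with made-up numbers: facility-level production is split across wells in proportion to each well’s reported rate, so one misreported rate skews every well’s allocation.

```python
def allocate(facility_total, reported_rates):
    """Split facility-level production across wells pro rata by reported rate."""
    total_rate = sum(reported_rates.values())
    return {well: facility_total * rate / total_rate
            for well, rate in reported_rates.items()}

# Hypothetical figures: well B's rate is double-counted by bad software.
print(allocate(1_000, {"A": 100, "B": 100}))   # {'A': 500.0, 'B': 500.0}
print(allocate(1_000, {"A": 100, "B": 200}))   # {'A': 333.3..., 'B': 666.6...}
```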

Revenue

The loss of a lease is one of the most significant problems for an oil and gas company that relates to revenue. This problem typically occurs when a company is unable to successfully market a lease or sustain its production for an extended period of time.

The failure to receive revenue from all well interests can also be a major problem affecting a company’s revenue. Well interests include basic royalties that are paid to the mineral rights owner and overriding royalty interests that are retained by third parties such as geologists.

Well interests include working interests that a company receives after royalties in exchange for exploring, developing and operating the property. Bad software can also reduce a company’s revenue by inaccurately estimating the reserves remaining in a particular well.

For more, read this post on how oil and gas software can improve decision making and forecasting.

Custom Software: DIY Advantages and Disadvantages

Good Planning a Key Differentiator for Custom Software

Custom software can be a great tool to match processes to your business. The recent proliferation of do-it-yourself tools makes this even easier because they allow people who aren’t professional programmers to create their own software.

This change in custom application development is part of the trend towards disruptive innovation, in which an innovation disrupts an existing technology. However, the ability of disruptive innovation to change traditional value organization and delivery has also resulted in some tools with bad user interfaces and poor performance.

A recent article on Brainzooming.com provides examples that illustrate the advantages and disadvantages of DIY software development.

DIY Custom Software Advantages

DIY custom software can be a great fit in those cases where it provides greater success than traditional off-the-shelf software. One example would be if a small business needed a basic level of reporting.

A well set-up Excel spreadsheet that is shared across the organization would probably be a fine solution. Even better would be if the business then uploaded that information into Tableau. This would bring a visual component to their report with a fairly low amount of effort.

DIY Disadvantages

This approach isn’t always the answer, however. As more users started to use the spreadsheet, it would become bloated and difficult to share. The lack of a good user interface would also probably mean that the owner of the spreadsheet would start to spend more time explaining how to use it.

With the addition of more data, Tableau can also benefit from the kind of sound data management strategies that the average business user is not familiar with.

Sound Strategic Thinking

There are other reasons that DIY software solutions may not be the best fit for your business. Sound strategic thinking can also be a factor.

The following case illustrates why unassisted use of DIY tools doesn’t always work. As Mike from Brainzooming highlights, the organizer of an event created a post-event survey using SurveyMonkey for attendees to complete.

The categories began with “very satisfied” on the left and progressed towards “very dissatisfied” on the right. It’s not obvious to the layperson, but an expert in marketing research would have immediately recognized that these categories were in the opposite order from which surveys typically present them.

Respondents completing the survey may have made their choices based on habit, instead of actually reading the category headings before making their selection.

Consequently, the results of this survey are unusable because the organizer has no way of knowing if the satisfaction ratings accurately reflect the respondents’ opinions. DIY tools failed in this case because the application required expertise in marketing research.

The bottom line on DIY custom software is that you should use and even embrace this option when it can provide you with an advantage over traditional methods of software development. However, you need to employ strategic thinking to ensure that your efforts provide the desired result.

For more on this topic, check out our series on the custom software buy versus build spectrum.

Custom Software: Four Moments of Truth

Moments of Truth for Custom Software

During a recent leadership conference, the Entrance team began brainstorming how to make our custom software consulting even better. The leadership team has since started an active conversation among our consultant team on this topic.

One of the main points the speaker made was that every business has moments of truth that make all the difference. For a restaurant, great food and service can be destroyed by a dirty floor or cockroaches. For a clothing store, the most stylish dresses can be outweighed by long lines and unfriendly clerks.

One of the Entrance values is “Improve everything,” or as some of us say, “Suck less every day.” As a potential client, you may be wondering how we live out this value.

We see moments of truth as one huge opportunity to bring this value front and center. The list below comes directly from the Entrance custom software team themselves. We see these as just a few of the places where we strive to improve the quality of our work every day!

Four Moments of Truth in Software

  • Any sprint demo

This is the first chance that clients have to see how the Agile methodology works. This isn’t just about selling an idea. It has to meet our client’s needs and efficiently deliver software that works.

  • Fixing custom software bugs

Every custom software application gets bugs once in a while. A good development team will identify the problem and fix it as quickly as possible. It’s just not acceptable to say a bug is fixed if it isn’t.

  • Owning mistakes

By the same token, every team makes mistakes. It’s how that team owns up to them and makes it better that defines this moment of truth.

For one client, the developer communicated his mistake directly. He then quickly fixed it. As a result, the client appreciated his work even more than they would have if there had been no mistake at all!

  • Requirements sign-off

This is one of those steps near the end of the custom software process that can make all the difference in terms of satisfaction. The development team and the client sit down to review what was promised and what has been delivered.

This can help bring to the surface any gaps in the final deliverable. If any are discovered, the team can develop a plan for making it right.

Improve Everything with Custom Software

Improving everything is a value that the Entrance team must live out every day. In addition, all of these moments of truth involve a degree of transparency.

As a client, it’s your job to be clear about what you need and to stay engaged throughout the process. The result of transparency on Entrance’s side is that you always know where your project stands and how we’re delivering on your business need.

For more on quality custom software, check out our Agile series, “Getting the Most for Your Money.”

Business Intelligence Deployment Misconceptions

Deploying Business Intelligence

Business intelligence, also commonly referred to as BI throughout the industry, is technology that allows a business to obtain actionable information that it can then use throughout its day-to-day operations. While business intelligence solutions certainly have their fair share of advantages, it is also important to realize that they are not the be-all, end-all solution for guidance that many people think they are.

There are certain business intelligence deployment misconceptions that businesses fall for over and over again, to their detriment. Understanding these misconceptions will allow you to avoid them and use BI to its fullest potential.

The Benefits of Business Intelligence

  • The information provided is accurate, fast and, most importantly, visible, aiding critical decisions about the growth and direction of a business.
  • Business intelligence can allow for automated report delivery using pre-calculated metrics.
  • Data can be delivered using real-time solutions that increase their accuracy and reduce the overall risk to the business owner.
  • The burden on business managers to consolidate information assets can be greatly reduced through the additional delivery and organizational benefits inherent in the proper implementation of business intelligence solutions.
  • The return on investment for organizations with regard to business intelligence is far-reaching and significant.

Business Intelligence Deployment Misconceptions

One of the most prevalent misconceptions about business intelligence deployment is the idea that the systems are fully automated right out of the box. While it is true that the return on investment for such systems can be quite significant, that is only true if the systems have been designed, managed and deployed properly.

A common misconception is that a single business intelligence tool is all a company needs to get the relevant information to guide itself into the next phase of its operations. According to Rick Sherman, the founder of Athena IT Solutions, the average Fortune 1000 company implements no fewer than six different BI tools at any given time.

All of these systems are closely monitored and the information provided by them is then used to guide the business through its operations. No single system will have the accuracy, speed or power to get the job done on its own.

Another widespread misconception is the idea that all companies are currently using business intelligence and that your company therefore has all the information it needs in order to stay competitive. In reality, only about 25 percent of all business users have been reported as using BI technology in the past few years. That 25 percent figure is actually a plateau – growth has been stagnant for some time.

One unfortunate misconception involves the idea that “self-service” business intelligence systems mean that you only need to give users access to the available data to achieve success. In reality, self-service tools often need more support than most people plan for.

This support is also required on a continuing basis in order to prevent the systems from returning data that is both incomplete and inconsistent.

One surprising misconception about the deployment of business intelligence is that BI systems have completely replaced the spreadsheet as the ideal tool for analysis. In truth, many experts agree that spreadsheets are and will continue to be the only pervasive business intelligence tool for quite some time.

Spreadsheets, when used to track the right data and perform the proper analysis, have uses that shouldn’t be overlooked. Additionally, business users that find BI deployment too daunting or unwieldy will likely return to spreadsheets for all of their analysis needs.

According to the website Max Metrics, another common misconception is that business intelligence is a tool that is only to be used for basic reporting purposes.

In reality, BI can help business users identify customer behaviors and trends, locate areas of success that may have previously been overlooked and find new ways to effectively target a core audience. BI is a great deal more than just a simple collection of stats and numbers.

For more on this topic, check out our series, “What is business intelligence?”