Schlumberger, the leading oilfield services company, is working to increase oil output and cut the industry’s costs. The Schlumberger Software Technology Innovation Centre (STIC), located in the heart of Silicon Valley, applies big data technologies pioneered by Google, Facebook and Amazon to oil exploration and production. STIC is focused on “innovative applications of software technology that will lead the industry's cloud platform transformation, supporting efficient, reliable and predictable real-time end-to-end workflows accessible from anywhere at any time”.
Essentially, the oil and gas industry is creating new business opportunities for providers of information technology, artificial intelligence, big data processing and analytics. For oil and gas, big data analytics could prove just as disruptive and transformative as it has been for the internet economy.
Several applications are beginning to see the light of day. Analysis of rock data is helping to place wells more precisely in oil-bearing zones; more accurate reservoir models are helping to maximise output; and advanced forms of automation are cutting costs while improving efficiency and operating safety.
These innovations could allow production costs to fall further and push oil prices down. However, cheaper oil is likely to delay the transition from conventional combustion engines to electric cars in emerging markets and squeeze traditionally high-cost oil-producing nations, including Canada, the UK and Russia.
Oil companies pioneered the use of computers as early as the 1960s, when BP used the most powerful machines of the day to map its Alaskan oil reservoirs. Today, energy companies including France’s Total, Italy’s ENI and Norway-based Petroleum Geo-Services, a reservoir imaging company, employ the world’s most powerful privately owned supercomputers.
As the internet of things has made it easier to use sensors to collect data throughout the production chain, oil companies are using the latest cloud computing services to store, process, sort and analyse the constant flow of data at previously undreamt-of speeds and low cost. Companies can now measure, monitor and model every aspect of the drilling process in real time. According to Chevron’s website, the volume of data it handles doubles every 12 months, and its Tengiz oil field in Kazakhstan will have a million sensors when production starts in 2022. Then there is digitisation, which allows for greater data storage and, combined with powerful analytical capabilities, is making it easier to discover oil.
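To give a feel for the kind of real-time monitoring this enables, the sketch below shows a minimal stream-processing loop in Python: it simulates downhole pressure readings and flags values that drift far from a rolling average. It is purely illustrative; the sensor values, window size and threshold are hypothetical assumptions rather than details of any company’s actual system, which would run at far larger scale on cloud streaming infrastructure.

    import random
    import statistics
    from collections import deque

    WINDOW = 50  # hypothetical rolling-window size (number of readings)

    def sensor_stream(n_readings):
        """Simulate a stream of downhole pressure readings (hypothetical units: psi)."""
        for _ in range(n_readings):
            # Baseline around 5,000 psi with noise and an occasional spike.
            reading = random.gauss(5000, 50)
            if random.random() < 0.01:
                reading += random.uniform(300, 600)  # rare anomalous spike
            yield reading

    def monitor(stream, sigma_limit=3.0):
        """Flag readings more than sigma_limit standard deviations from the rolling mean."""
        window = deque(maxlen=WINDOW)
        for i, reading in enumerate(stream):
            if len(window) == WINDOW:
                mean = statistics.fmean(window)
                stdev = statistics.stdev(window)
                if stdev and abs(reading - mean) > sigma_limit * stdev:
                    print(f"reading {i}: {reading:.0f} psi flagged (rolling mean {mean:.0f})")
            window.append(reading)

    if __name__ == "__main__":
        monitor(sensor_stream(5000))

The same pattern, applied across millions of sensors, is what turns a constant flow of field data into something drilling engineers can act on as it arrives.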
Oil service companies that cannot afford their own supercomputers are forming alliances with IT companies to gain access to new cloud-based data processing and analytics services. For instance, Halliburton uses Microsoft for its big data processing and analysis, while Nvidia, the producer of high-performance video gaming microchips, helps Halliburton adapt its technology for viewing and interpreting seismic survey data. Likewise, Baker Hughes has announced that it is using artificial intelligence to help it extract, process and develop oil and gas finds. Lastly, Schlumberger has launched a new software system called Delfi, which brings together and coordinates the way wells are designed, drilled and brought into production, to maximise output from an entire oilfield.
Matt Rogers, of consultancy McKinsey, says that forecasters have failed to grasp the full magnitude of the coming changes: “I don’t think we’ve built into our supply-side models just how much more oil this will provide. The world in 10 years will feel very different. It will feel like we’re in Star Wars compared to where we are now.”