The evolution of polycrystalline diamond drill bits

Early PDC drill bit.

On Construction Physics, we’ve talked frequently about the importance of learning curves as a mechanism by which things get less expensive over time. Learning curves, also known as experience curves or Wright’s Law, track the relationship between cumulative production volume and cost. Specifically, goods tend to show a constant rate of cost decrease (known as the learning rate) for every cumulative doubling of production. Solar PV, for instance, has had a recent learning rate of 44%: for each cumulative doubling of solar PV panel production — 50 to 100 gigawatts, 100 to 200 gigawatts, 200 to 400 gigawatts — costs fall by around 44%.

Learning curves are often conflated with the similar, but distinct, concept of “learning by doing,” the idea that the more you do something, the better you get at it. Learning curve gains are often assumed to be the result of learning by doing. The Our World in Data page on learning curves, for instance, states that “That the price of technology declines when more of that technology is produced is a classic case of learning by doing. Increasing production gives the engineers the chance to learn how to improve the process.”

As originally formulated, the learning curve probably did consist mostly of learning by doing. Theodore Wright first used learning curves in 1936 to describe airplane manufacturing. Improvements there probably were driven by experience, as factory staff got more skilled and figured out how to improve the assembly process.¹

But the more generic relationship between rising production volume and falling cost includes many factors other than learning by doing. With lithium-ion batteries, for instance, progress down the learning curve was in large part due to things like “building higher-capacity equipment and factories to take advantage of economies of scale,” and “discovering new battery chemistries that use less expensive materials.” These aren’t the result of learning by doing.²

But even though learning by doing isn’t the only part of the learning curve, it’s nevertheless an important facet of it, and of technological improvement more generally. One way to understand the learning curve is to look specifically at technologies where progress does seem to have been heavily driven by accumulating experience.

One such technology is polycrystalline diamond compact (PDC) drill bits. Over the last 50 years, PDC bits have gradually become the standard drill bit for drilling oil and gas wells, replacing the previous roller-cone bit technology. As PDC bits have improved, they’ve made it possible to drill faster, and to drill for longer without having to change out the bit, both of which make it cheaper to drill a well. By making drilling faster and cheaper, PDC bits have helped enable the shale revolution, and are one of the drivers of progress in enhanced geothermal energy.

The Rise of PDC Bits

Percent of oil and gas well footage drilled by PDC bits.

The nature of drill bits means that progress in them is heavily driven by learning by doing. A drill bit’s job is essentially to drill through rock as quickly as possible, for as long as possible, without breaking or wearing out. Because many costs of drilling are by the hour or day, all else being equal, the more that you can drill in a given amount of time, the cheaper the drilling will be. And changing out a worn or broken drill bit means spending time removing literal miles of drill string (the long shaft of piping that connects the drill bit to the surface), changing out the bit, and then sending the drill back down, a process which can take hours or even days.

Figuring out how to make a bit drill faster and last longer in large part comes from experience: every failed or damaged drill bit is an opportunity to understand the causes behind that failure, and to modify the drill bit and drilling process to try and avoid it. 50 years of constant modifications based on both real-world and laboratory drill bit failures have steadily pushed PDC drilling performance higher and higher.

PDC bits are also a good case study in learning by doing because their evolution is unusually well-documented. In general, the sort of factory-floor-level improvements that make up learning by doing don’t get written down in a way that’s accessible to the wider world, either out of a desire for secrecy (companies trying to protect their competitive advantages) or out of apathy (no one bothers). But drilling companies seem to be exceptionally open about their drilling experiments and progress, and there’s a great deal of published research from major oil companies documenting their drilling improvement efforts.

Origins of the PDC bit

For most of the 20th century, oil and gas drilling was done primarily using roller-cone bits. The roller-cone bit, invented by Howard Hughes Sr. in 1907³, consisted of two (later three) interlocking, rotating wheels. When the bit was pressed down against the rock and rotated, the wheels rolled along the bottom, and the vertical force crushed the rock beneath the teeth of the bit.⁴ Drilling fluid (known as drilling mud) was pumped out through holes in the bottom of the bit, which kept the bit clean, cooled it, and carried away the broken rock fragments; at the surface, the drilling fluid was cleaned and reused.

How a roller-cone bit works, via Ozdemir et al 2022.

Drilling through rock gradually wears down the bit, requiring an expensive and time-consuming bit change. The stronger the material you can make your bit from, the more durable it will be and the longer it will last. The first roller-cone bits were made from steel, and by the 1960s roller-cone bits with even stronger tungsten carbide teeth had been developed. But even as tungsten carbide roller-cone bits began to be adopted, the next phase in drill-bit evolution was fermenting.

In 1955, GE invented a process for producing synthetic diamonds, and by the 1970s it was producing synthetic diamond cutting tools under the trade name COMPAX. These cutting tools consisted of a very thin sheet of tiny synthetic diamond particles, bonded together under high heat and pressure onto a tungsten carbide disk. The thin sheet, called a table, would then be attached to a tungsten carbide “slug” to form a cutter, which could then be mounted into a cutting tool.

Synthetic diamond cutters were originally developed for use in conventional machining, but it was clear that they might prove useful in rock drilling. At the time, diamond-tipped rock drill bits were used in some applications (mostly very hard rock), but these bits used natural diamonds which had a tendency to fracture along planes of weakness in the crystal structure, wearing away the cutting edge. A synthetic diamond cutter, made from tiny grains of diamond in random orientation, wouldn’t have this weakness, and would maintain a cutting edge even as it wore down (for this reason PDC cutters are described as “self-sharpening”).

COMPAX cutter, via A Bit of History.

The first PDC drill bits for rock drilling, using GE COMPAX cutters, were built in 1973. Unlike roller-cone bits (which drilled rock by crushing), and natural diamond bits (which drilled rock by grinding), the PDC bit drilled rock by shearing, cutting away the rock like a machine tool. Because rock has less tensile strength than compressive strength, shearing is theoretically more efficient for drilling, taking on the order of a third of the energy that crushing requires. Their more efficient cutting mechanism, combined with the high strength of diamond, suggested that PDC bits could be an enormous improvement over existing roller-cone bits: early lab tests showed that PDC bits could cut through rock two to four times as fast as roller-cones, and one test showed a PDC cutter lasting 145 times as long as a tungsten carbide cutter.

Rock failure modes of different bit types, via Gill et al 1985.

But field tests revealed serious problems with the bits. Initial PDC bits were essentially existing natural diamond bits, with the natural diamonds replaced with synthetic diamond cutters, but this design didn’t work particularly well, and the bits wore out quickly. Investigation showed that the cutters were too small to get good rates of penetration (ROP), the diamond tables had a tendency to delaminate and break off from the tungsten carbide studs, and the studs themselves tended to break. The drilling fluid also wasn’t properly cooling the bit and carrying away the rock fragments.

Bit designers realized that the different rock-breaking mechanism of PDC cutters required entirely new bit designs, and over the next several years experiments by bit manufacturers and drilling companies yielded PDC bits which worked significantly better. Cutters were arranged in a series of “blades,” the material of the bit body was changed from steel to a matrix of tungsten carbide (though steel bodies continued to be developed), and the hydraulics of the bit were redesigned for improved drilling fluid flow. On the cutters, the tungsten carbide stud was lengthened, and in 1976 GE released a larger diamond cutter (13mm in diameter, compared to the original 8mm). Three years later, GE developed an improved brazing process, LS brazing, which greatly reduced the diamond delamination problem.

Early PDC bit evolution, via Kerr 1988.

While much of this development was the result of field trials by various oil companies, early PDC bit improvement was also heavily driven by research at Sandia National Labs. In the wake of the oil embargo, the US government began to investigate a variety of alternative energy sources, one of which was geothermal. It was thought that PDC bits could potentially reduce geothermal drilling costs, as they lacked the seals and moving parts that were a frequent source of failure of roller-cone bits in hot geothermal conditions. In the early 1970s Sandia, partnering with GE, undertook an extensive research program to develop PDC bits for geothermal applications. Through extensive testing (both of individual cutters and entire bits), Sandia investigated hydraulic flow patterns of drilling fluid, rock chip formation mechanics, the effects of different cutter arrangements and different rock types, bit wear rate, and differential heating of bit materials. Then it shared its findings with bit manufacturers and oil companies. Sandia was eventually able to develop a model of PDC cutter and bit behavior that could predict things like cutter temperature, cutter force, and bit wear. A software tool based on this model, PDC WEAR, is still used today to design PDC bits. Sandia’s initial funding of the PDC bit industry, at a time when the bits seemed to lack promise, is credited with saving it.

By the early 1980s, PDC bits had overcome their early teething troubles, and 20 companies were manufacturing PDC bits. But while PDC bits showed great potential, and in some cases performed much better than roller-cone bits, success was intermittent and unpredictable, as a case study of technological transfer notes:

It was not clear where — geologically — PDC bits could be fruitfully used; the drilling industry tried out the new bits with considerable success in parts of Texas, Louisiana, and the North Sea and with rather stunning technological failures in other regions. As one manufacturer advised at the time, “If you remember nothing else about what I have to say, Stratapax or polycrystalline product is very formation-sensitive and it takes tender loving care.” Early commercial reports from the field were mixed; PDC bits had an immediate cost impact in the southwest and central U.S. and the North Sea and yet rates of penetration and time savings were often not good enough to compensate for the much higher cost of PDC bits relative to conventional steel roller-cone bits. The cumulative result of these problems threatened the market success of the bits. Word of drilling problems spread rapidly through the industry and few drilling contractors were willing to try PDC bits. In a short period of time, the bit earned a bad reputation.

PDC bits were also found to perform poorly in hard, abrasive rock, limiting their flexibility. By 1982, less than 2% of well footage worldwide was drilled with PDC bits.

But experimentation and feedback from the field continued to yield improvements. Manufacturers began to roughen the surface between the diamond table and the stud with grooves and other patterns, which improved attachment and cutter performance: these non-planar interfaces were first introduced in 1984 and have since become standard. Cutters were made even larger, rising to 19mm in diameter, diamond tables were made even thicker, and chamfers were added to the disk edges. In some rock, particularly clay and shale, cuttings had a tendency to ball up on the drill bit (a phenomenon known as bit balling), but this problem was significantly reduced by developing “fishtail” PDC bits with thin cutting blades that were more fully cleaned by the drilling fluid. More ports for drilling fluid to exit the bit were added, and the port locations and hydraulic design of the bits were improved. Lab and field studies began to map out the impact of varying design parameters like cutter arrangement, cutter angle, and cutter density, and drillers began to learn the operating parameters, like revolutions per minute, torque, and weight on bit, with which PDC bits worked best. Cutting experiments showed that the size of the diamond grains had a large effect on cutter performance, and manufacturers developed diamond tables with a mix of grain sizes: small grains on the surface for better cutting, on top of a layer of larger, tougher grains. And as PDC bits improved, adoption crept up: by 1990, PDC bits were responsible for around 5% of oil and gas footage drilled.

Fishtail PDC bit, via Feenstra 1988.

PDC developments between 1982 and 1986, via Falcone 2005.

Bit whirl

A major advance in PDC drilling in the late 1980s was the discovery of the effects of “bit whirl,” a type of vibration in the drill string. Bit whirl occurs when a drill bit doesn’t just spin, but rotates eccentrically around a center point. A whirling bit changes the path a cutter takes, resulting in loading from undesirable directions and impacts that can damage the cutters. Based on field studies of PDC bits, researchers at Amoco discovered that bit whirl was a major cause of damaged cutters and bit failure: cutters were being bashed against the side of the hole, causing damage and rapid wear. Bit failures that were thought to be due primarily to thermal effects (i.e., the bit heating up) were discovered in many cases to be primarily due to bit whirl, with damaged cutters rapidly failing under high heat.

How bit whirl damages cutters, via Warren 1994.

The discovery of the effects of bit whirl prompted industry research and development efforts to find ways to mitigate it, resulting in things like the addition of “tracking” cutters to provide grooves for cutters to travel through and low-friction “gauge pads” to the edge of the bit. Manufacturers improved their quality control to allow more precise placement of cutters, and more generally focused on designing more balanced bits. Using many of these advances, in 1990 Amoco released an “antiwhirl” bit to the market, which had been designed using a modified version of Sandia’s PDC WEAR program. And outside of efforts to reduce bit whirl, the quality of the diamond tables kept increasing. Between 1993 and 1995, diamond cutters became twice as abrasion resistant, and new cutter shapes that were less likely to break were introduced. In the field, rig crews gradually learned how to adjust parameters like bit weight, fluid pressure, fluid flow rate, and RPMs to maximize PDC bit performance and minimize bit whirl.

These efforts helped reduce PDC bit failures, made it possible to use PDC bits in harder rock, and made PDC bits much more appealing: a PDC bit that didn’t break or fail prematurely could drill twice as fast as a roller-cone bit, and last two to three times as long before being replaced. Between 1989 and 1996, the average drilling speed of PDC bits rose 60%, and footage drilled per bit rose 115%. PDC bits were still expensive — 5 to 15 times the cost of a roller-cone bit — but the cost of the bit itself (2-3% of the well cost) was nothing compared to its impact on overall well costs. Some 75% of the costs of drilling a well were directly or indirectly affected by bit performance, and the improved efficiency of PDC bits was often worth the added expense. Between 1990 and 1995, the share of well footage drilled by PDC bits jumped from 5% to 15%, and by the mid-1990s PDC bits had become “a standard tool that is used in almost all drilling provinces, rather than being a specialty tool used in only a few specific formations.”

TUNU case study

A case study of an Indonesian oil field demonstrates the impact PDC bits could have on drilling costs. Prior to 1986, wells in the TUNU offshore oil field were drilled entirely using roller-cone bits: wells were drilled at an average rate of 6 meters per hour, required 11 bits per well on average, and cost about $470 per meter to drill. On average it took around 13 days to drill a well.

In 1986, PDC fishtail bits were introduced in the TUNU oilfield. Early tests showed around a 20% drilling speed improvement, enough of a success to prompt further experimentation with different types of PDC bits and drilling parameters. As drillers homed in on the best combination of bit design and drilling procedures, drilling performance continued to improve. By 1991, average drilling rate had risen to 10 meters per hour, the number of bits required per well was reduced from 11 to 6, and time per well was cut from 13 days to just over 8 days. Drilling costs fell to $270 per meter.

By 1994, it had become clear that impacts on cutters were causing premature bit failure. Based on analysis of bit behavior in the TUNU wells, new PDC bits were designed with larger, chamfered, impact-resistant cutters, better drilling fluid flow, improved balance, gauge pads, and spiral-shaped blades. Tungsten carbide matrix bit bodies were replaced with steel bodies, and the angle of the cutters was adjusted to allow for greater drilling rates. By the late 1990s, an entire well could be drilled with a single bit at a cost of $70 per meter, and the time to drill a well had fallen to less than 3 days. Over 10 years, PDC bits improved the rate of drilling by 800%, increased bit lifetime by 1,000%, and reduced drilling costs by more than 85%.⁵

Savings from PDC Bits

Savings achieved by adopting PDC bits in the TUNU oil field in Indonesia, from 1986 to 1997.

Cutters continue to improve

By 2000, thanks to their continued improvement, PDC bits were responsible for 24% of oil and gas footage drilled worldwide. But there were still limits to where the bits could be used, particularly very hard rock. Often wells would start with PDC bits, but be forced to switch to roller-cone bits for the hard rock in the final portions of the well.

One major limiting factor was the quality of the synthetic diamond cutters. In the early 2000s one of the major cutter manufacturers, US Synthetic, began to investigate whether the diamond itself could be made higher quality. Though synthetic diamond is chemically identical to natural diamond, the process of turning diamond grit into solid diamond tables wasn’t perfect, and synthetic diamond cutters weren’t as hard as natural diamond. US Synthetic began to run extensive experiments, tweaking manufacturing process parameters and testing the resulting cutters to determine the impact on performance.

Diamond cutters are made by compressing synthetic diamond grit onto a tungsten carbide disk at very high temperatures and pressures, which fuses the individual grains together in a process called sintering. Over the course of many experiments and cutter tests, US Synthetic learned that if temperatures and pressures were increased, the result was more durable, higher-quality diamond disks. Over 20 years US Synthetic figured out how to nearly double the pressure of the production process, raising it from roughly 870,000 pounds per square inch to nearly 1.5 million pounds per square inch. Operating temperatures were similarly nearly doubled. Not only were the resulting diamond tables harder and stronger, but they were more thermally stable, making them more resistant to the high temperatures generated while drilling.

Another major US Synthetic improvement effort involved chemical leaching. The diamond table production process uses liquid cobalt to catalyze bonding the diamond particles together, but once the process is complete, the cobalt becomes an undesirable impurity. Cobalt has a higher coefficient of thermal expansion than diamond, and at high temperatures it will expand more than the surrounding diamond, breaking it apart. However, studies of machine tool cutters in the 1950s showed that the highest temperatures were concentrated in a very thin layer of material at the surface and rapidly dropped below that: if you could remove the cobalt from that thin layer, the thermal resistance of the cutters would dramatically improve. As thermal wear is ultimately the dominant factor in overall cutter wear, improving thermal resistance would dramatically improve cutter performance.

The company NOV developed a “cobalt leaching” process for doing this in the late 1990s, and over the next several decades US Synthetic gradually pushed how deep leaching could reach into the diamond table. Between 2005 and 2020, leach depth on US Synthetic cutters increased by a factor of 10, greatly improving the thermal resistance of the cutters.

In addition to these efforts, US Synthetic also experimented with different cutter shapes beyond the simple cylinder. Such shaped cutters are now extensively used in the industry, and the shape is tailored to the specific conditions of the well being drilled.

As a result of these improvement efforts, the durability of US Synthetic cutters improved enormously. A common performance measure for diamond cutters is the grinding ratio, or G-ratio, the volume of rock removed per unit volume of cutter material worn away: the higher the ratio, the more durable the cutter. Over 15 years, the G-ratio of US Synthetic cutters rose more than 1,000-fold.

Limiter-based redesign

Equally important as developing more durable, better performing cutters was learning how best to use them, and understanding of PDC bit behavior continued to advance as well. In the early 2000s Fred Dupriest, an engineer working for Exxon Mobil, began to study drilling efficiency using a metric called Mechanical Specific Energy (MSE), the energy required to break up a given volume of rock. The concept of MSE had been developed in the 1960s by R. Teale, who found that MSE was a function of the strength of the rock, and was (theoretically) unaffected by things like drilling rate or drill RPMs. An efficient drill bit that was operating correctly should result in a relatively constant MSE.

MSE was historically used to measure drilling efficiency in the lab, but it wasn’t common to use it in the field: instead, rig crews mostly measured efficiency using metrics like feet drilled in a given day. Because rock conditions differed from well to well, this made it difficult to determine how efficiently a bit was operating, or to suggest solutions. When measuring MSE while drilling in the field, Dupriest discovered that it would often rise significantly just before severe bit damage. Dupriest eventually realized that these jumps in MSE were often due to failure modes unrelated to the strength of the bit, such as excessive vibration or material getting stuck to the bit. If they were addressed, drilling rates could be increased, often substantially, until the physical limits of the bit were reached.

Dupriest eventually developed a drilling improvement system known as “limiter redesign,” based on the physics of how a PDC bit drills rock. When drilling, the weight of the drill string and vertical force from the drilling rig (a value known as weight on bit, or WOB) push down on the bit, which presses the cutters into the rock to a certain depth (known as depth of cut, or DOC). As the bit rotates, the rock is cut away: the faster the bit rotates (the greater the RPM) and the greater the depth of cut, the faster the bit drills (its rate of penetration, or ROP).

Critically, a higher depth of cut does not make the cutters more likely to fail, since the failure load of the rock is well below the strength of the cutter (which, thanks to the efforts of companies like US Synthetic, had become incredibly strong and durable). Instead, cutter failure is due to the gradual wearing away of the bottom edge of the cutter (known as the wear flat) from friction as it slides. The wear rate was found to be a function of distance travelled; because greater weight on bit meant a greater depth of cut (and thus more material removed for a given amount of cutter sliding), drilling faster would actually wear the bit less. Theoretically, a driller was best served by increasing the weight on bit to the maximum allowed by the bit’s design, which would allow for the highest possible ROP.

Wearing surface on a PDC cutter, via Rostamsowlat et al 2022.

However, in practice a bit would likely encounter other failure modes well before maximum weight on bit was reached. These “limiters,” diagnosed by a rise in MSE, would prevent the bit from efficiently transferring energy into the rock, preventing further increases in ROP and likely damaging the bit. But these limiters could be relaxed by redesigning the bit or by changing drilling parameters, avoiding bit damage and allowing further increases in ROP until the next limiter was encountered. By addressing each limiter as it arrived, ROP could be increased until the physical limits of the bit were reached.

Bit limiters, from Dupriest 2020.

Dupriest ultimately described five major categories of bit limiters: bottomhole balling (cut rock particles accumulating below the drill bit), bit whirl, interfacial severity (transitions between hard and soft rock causing sudden torque changes), bit balling, and stick-slip (different parts of the drill string rotating at different speeds when the bit momentarily sticks).⁶ Depending on the limiter encountered, and the type of rock being drilled through, different interventions would be used to resolve it. In an oil well in Qatar, for instance, an increase in MSE was diagnosed as resulting from bit balling in soft limestone. To resolve this, the drilling fluid nozzles on the bit were changed to smaller ones, and the rate of drilling fluid flow was increased. This more thoroughly cleaned the bit, resolving the bit-balling problem and immediately decreasing MSE.

Within a few years, Dupriest had trained thousands of contractors in limiter-based redesign, and today the strategies have spread from Exxon Mobil and become common in the industry. In some cases these methods allowed for a four-fold or greater improvement in drilling speed.


Source: The evolution of polycrystalline diamond drill bits, Construction Physics

Brian Potter

Dec 12, 2024
