
And just like that, “boiling the ocean” has become all the rage (sort of).


A Little Background


The phrase “boiling the ocean” is frequently applied to the tendency to pour massive effort into a task, commonly one involving data or analysis, such that the scope or results go beyond what the task at hand actually requires.  In the literal analogy, boiling the ocean just to capture a subset of desired fish would be overly broad and would demand enormous amounts of energy.  While the phrase itself is common, its appropriateness is highly variable and often depends on a given stakeholder’s perception of the difficulty or benefit.  The result is that the phrase has been applied far too liberally, shutting down opportunities to leverage data in beneficial ways.


While well-meaning, many people I’ve heard use the phrase would apply it whenever a dataset got particularly involved and went beyond what Microsoft Excel commonly handles via pivot tables and the like.  If an analysis began to require even Microsoft Access, there would be some discomfort and questions about whether it was going too deep.  If SQL Server or other enterprise and big-data analytics tools were the best approach, it was necessary to be extremely judicious about even suggesting them, lest the “boiling the ocean” charge be levied.


If a more involved approach required additional time, however worthwhile, then even more convincing was needed in an attempt to win over a concerned stakeholder that the added time, effort, or complexity was indeed justified.  And sometimes, even then, the convincing didn’t work.  The prevailing view among many people was that most worthwhile analysis could be conducted fairly quickly with summary-level data and common tools like Excel, and that if someone was using more than that, they were probably “boiling the ocean”.


What’s Different About Today


Now, as most organizations embrace generative AI in some form, they are inherently leveraging LLMs, RAG, and other models where the computing power required to train, and sometimes to analyze inputs, is measured in petaflops…a scale which, not too long ago, was reserved for a handful of special-purpose supercomputers.  To keep up with this demand for computing power, there is a virtual arms race among cloud computing providers to obtain the GPUs needed to support the generative AI revolution well into the future.


As businesses race to identify the best ways to leverage generative AI capabilities, LLMs, RAG, and other complex data models and related tools are necessarily becoming a key part of the discussion.  While this is a far cry from machine learning frameworks, deep learning optimizers, and vector databases being as ubiquitous and widely used as Excel is today, people who might previously have only lightly used Excel are gradually becoming conditioned to these concepts and tools, and to why they matter.


Going Forward & Why It Matters


Perhaps, in the not-too-distant future, a desktop suite containing such AI tools will actually be as ubiquitous and commonly used as Microsoft Office is today.  In the meantime, it seems as if the tendency to reflexively exclaim “boiling the ocean” upon any discussion of large datasets and complex models may finally be coming to an end.  While there will always be legitimate cases where the accusation applies, as a broader swath of business leaders, consultants, and middle managers comes to appreciate the need for and benefit of going “beyond Excel”, a less reflexive, more judicious approach will distinguish what's actually appropriate from what's truly excessive.  My inner-geek heart sings.


This deeper understanding and greater discernment around data and complex models will be required for organizations aiming to leverage generative AI and related optimization technologies (think supply chain optimization).  In turn, effectively deploying those capabilities, including true supply chain optimization, will separate the leaders from the losers in the not-too-distant future.  Eventually, this insight and capability will become table stakes for any organization serious about competing at all, and for extreme laggards, playing catch-up with a rapid and intense “burning platform” initiative will become increasingly difficult, if not impossible, to pull off.


My Inner Geek (A Little About Me)


Here’s the thing: I’ve always been a geek at heart.  In high school in the 1980s, I used an HP-28S scientific calculator as my all-around calculator.  This model was considered the gold standard and was essentially expected among my high school “geek” friends.  In college, I’d code for fun, and once I learned about linear and non-linear optimization and the supporting software (think MS-DOS-era tools), I would look forward to finishing mandatory classwork just so I could experiment with other models…you know, for fun…on the weekend.

My Actual HP-28S: A Required Geek Accessory In High School (circa 1989)

By the time I first began consulting around 2000 -- to use a Superman/Clark Kent/high school analogy mashup -- I let go of the glasses, pocket protector, and HP-28S calculator to don a varsity letter jacket -- one for cross-country and track, though, not football, wrestling, or anything really hardcore, but enough to mask my inner geek beneath a more relatable business façade.  And with the coaching of some mentors and colleagues, I even became fluent in speaking "not-geek" for executive presentations.


I describe my background in jest, but it is fundamentally accurate, and it explains why I took such a liking to the S&OP and Supply Chain Planning disciplines.  While there are plenty of process-oriented requirements associated with these disciplines, it was clear to me in the early 2000s that the end game would require Sales & Operations Optimization (or Supply Chain Optimization), aided by appropriate datasets, robust modeling (moving toward digital twins), and advanced IT capabilities in general.  We are now on the threshold of an era where that will no longer be an ideal to aspire to, but rather a requirement to excel at in order to remain competitive.
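To make the idea of supply chain optimization a bit more concrete: at its core it reduces to mathematical programming, minimizing cost subject to capacity and demand constraints.  As a minimal sketch only (the plants, markets, costs, and capacities below are invented for illustration, not drawn from any real model), a toy two-plant, two-market transportation problem can be expressed as a linear program and handed to an off-the-shelf solver such as SciPy's linprog:

```python
# Toy transportation problem: two plants shipping to two markets.
# All numbers here are hypothetical, chosen purely for illustration.
from scipy.optimize import linprog

# Decision variables: shipments x = [p1->m1, p1->m2, p2->m1, p2->m2]
cost = [4, 6, 5, 3]            # shipping cost per unit on each lane

# Plant capacity constraints (A_ub @ x <= b_ub)
A_ub = [[1, 1, 0, 0],          # plant 1 can ship at most 100 units
        [0, 0, 1, 1]]          # plant 2 can ship at most 150 units
b_ub = [100, 150]

# Market demand constraints (A_eq @ x == b_eq)
A_eq = [[1, 0, 1, 0],          # market 1 needs exactly 120 units
        [0, 1, 0, 1]]          # market 2 needs exactly 130 units
b_eq = [120, 130]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")

print(res.x)    # optimal shipment plan on each lane
print(res.fun)  # minimum total shipping cost
```

Real S&OP and digital-twin models add thousands of SKUs, multiple time periods, and often nonlinear terms, but the structure is the same; this is the kind of modeling that summary-level data in a spreadsheet simply cannot express.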


Need help on digital transformation and digital twins?


Need help on where to start with a digital transformation plan?  On developing models or building a highly responsive, end-to-end digital twin?   On quantifying the benefits or value of any of this? With a 1-week assessment, we can help identify or confirm transformation opportunities, options, and benefits specific to your organization. Click the button below to arrange an initial discussion. I'd love to speak with you.




Contact Me

Tel: 646-783-9386

kphillips@vcgroup.com


© 2020 by Supply Chain Science

