Big Data

New Perspectives on Business Ecosystems


One of the many changes resulting from the COVID-19 pandemic has been a sea change in thinking and goals around Supply Chain Management (SCM). Existing SCM systems were up-ended in mere months: procuring everything from raw materials to components became challenging, manufacturing shifted to meet new and unanticipated needs, and logistics challenges arose from health-related staffing issues, safe working distances, and limited shipping options and availability. In short, things are a mess!

Foundational business changes will require modern approaches to Change Management. Change is not easy – especially at scale – so having ongoing support from the top down and providing incentives that motivate the right behaviors, actions, and outcomes will be especially critical to the success of those initiatives. And remember, “What gets measured gets managed,” so the aspects of business and change that really matter deserve the most attention.

Business Intelligence systems will be especially important for Descriptive Analysis. Machine Learning will likely begin to play a larger role as organizations seek a more comprehensive understanding of patterns and work towards accurate Predictive Analysis. And of course, the use of Artificial Intelligence / Deep Learning / Neural Networks should accelerate as the need for Prescriptive Analysis grows. Technology will provide many of the insights business leaders need to make the best decisions in the shortest amount of time that is both possible and prudent.
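
To make the descriptive-to-predictive progression a bit more concrete, here is a minimal sketch in Python. It assumes scikit-learn and pandas are available, and the file name and column names ("shipments.csv", "supplier", "lead_time_days", "order_qty", "late") are hypothetical; it is an illustration of the idea, not a recommended SCM analytics stack.

```python
# Minimal sketch: descriptive summary plus a simple predictive model.
# File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("shipments.csv")  # hypothetical historical shipment data

# Descriptive analysis: what has already happened?
print(df.groupby("supplier")["late"].mean())  # late-delivery rate per supplier

# Predictive analysis: which orders are likely to be late next?
X = pd.get_dummies(df[["supplier", "lead_time_days", "order_qty"]])
y = df["late"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```

Prescriptive analysis would then sit on top of a model like this, recommending actions such as re-routing orders away from suppliers the model flags as high risk.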

This is also the right time to consider upgrading to a modern business ecosystem that is collaborative, agile, and able to expand and adapt quickly and cost-effectively to whatever comes next. Click on this link to see more of the benefits of this type of model.


Whether you like it or not, change is coming. So, why not take a proactive posture to help ensure that this change is good and meets the objectives your company or organization needs?

Changes like this are all-encompassing, so it is helpful to begin with the mindset of “Win together, lose together.” In general, it helps to have all areas of an organization moving in lockstep towards a common goal, but at a critical juncture like this it is no longer optional.

Blockchain, Data Governance, and Smart Contracts in a Post-COVID-19 World


The last few months have been very disruptive to nearly everyone across the globe. There are business challenges galore, such as managing large remote workforces – many of whom are new to working remotely – and managing risk while attempting to conduct “business as usual.” Unfortunately for most businesses, their systems, processes, and internal controls were not designed for this “new normal.”

While there have been many predictions around Blockchain over the past few years, it is still not widely adopted. We are beginning to see an uptick in adoption within Supply Chain Management systems, for reasons that include traceability of items – especially food and drugs. But large-scale adoption has been elusive to date.


My personal belief is that we will soon begin to see large shifts in mindset, investment, and effort towards modern digital technology, driven by Data Governance and Risk Management. I also believe that this will lead to these technologies becoming easier to use via new platforms and integration tools, which in turn will lead to faster adoption by SMBs and other non-Enterprise organizations.

Here are a few predictions:

  1. New wearable technology supporting Medical IoT will be developed to help provide an early warning system for disease and future pandemics. That will fuel a number of innovations in various industries including Biotech and Pharma.
    • Blockchain can provide the necessary data privacy, data ownership, and data provenance to ensure the veracity of that data (a minimal provenance sketch follows this list).
    • New legislation will be created to protect medical providers and other users of that data from being liable for missing information or trends that could have saved lives or avoided some other negative outcome.
    • In the meantime, Hospitals, Insurance Providers, and others will do everything possible to mitigate the risk of using the Medical IoT data, which could include Smart Contracts as a way to ensure compliance (which assumes that there is a benefit being provided to the data providers).
    • Platforms may be created to offer individuals control over their own data, how it is used and by whom, ownership of that data, and payment for the use of that data. This is something that I wrote about in 2013.
  2. Data Governance will be taken more seriously by every business. Today companies talk about Data Privacy, Data Security, or Data Consistency, but few have a strategic end-to-end systematic approach to managing and protecting their data and their company.
    • Comprehensive Data Governance will become both a driving and gating force as organizations modernize and grow. Even before the pandemic there were growing needs due to new data privacy laws and concerns around areas such as the data used for Machine Learning.
    • In a business environment where more systems are distributed there is increased risk of data breaches and cybercrime. That will need to be addressed as a foundational component of any new system.
    • One or two Data Integration Companies will emerge as undisputed industry leaders due to their capabilities around MDM, Data Provenance & Traceability, and Data Access (an area typically managed by application systems).
    • New standardized APIs akin to HL7 FHIR will be created to support a variety of industries as well as interoperability between systems and industries.
  3. Anything that can be maintained and managed in a secure and flexible distributed digital environment will be implemented as a way to allow companies to quickly pivot and adapt to new challenges and opportunities on a global scale.
    • Smart Contracts and Digital Currency Payment Processing Systems will likely be core components of those systems.
    • This will also foster the growth of next-generation Business Ecosystems and collaborations that will be more dynamic in nature.
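
As referenced in item 1, below is a minimal, hypothetical sketch of the provenance idea: each record commits to the hash of the previous record, so any after-the-fact tampering with Medical IoT readings is detectable. This illustrates the underlying concept only; it is not the API of any specific blockchain platform, and the device fields are made up.

```python
# Minimal sketch of hash-chained provenance records (illustrative only,
# not any specific blockchain platform's API).
import hashlib
import json
import time

def make_record(payload: dict, prev_hash: str) -> dict:
    body = {"payload": payload, "prev_hash": prev_hash, "ts": time.time()}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify_chain(chain: list) -> bool:
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False  # record was altered after it was written
        if i > 0 and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False  # chain linkage is broken
    return True

# Hypothetical Medical IoT readings appended to the chain.
chain = [make_record({"device": "wearable-01", "heart_rate": 72}, prev_hash="genesis")]
chain.append(make_record({"device": "wearable-01", "heart_rate": 75}, chain[-1]["hash"]))
print(verify_chain(chain))  # True; editing any earlier record makes this False
```

A real deployment would add distributed consensus, access control, and Smart Contract logic on top of this basic tamper-evidence property.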

All in all, this is exciting from a business and technology perspective. It will require most companies to review and adjust their strategies and tactics to embrace these concepts and adapt to the coming New Normal.

The steps we take today will shape what we see and do in the coming decade so it is important to quickly get this right, knowing that whatever is implemented today will evolve and improve over time.

Good Article on Why AI Projects Fail



Today I ran across an article that was very good because it focused on lessons learned, which potentially helps everyone interested in these topics. It contained a good mix of problems described at a non-technical level.

Below is the link to the article, as well as commentary on the Top 3 items listed from my perspective.

https://www.cio.com/article/3429177/6-reasons-why-ai-projects-fail.html

Item #1: 

The article starts by discussing how the “problem” being evaluated was misstated using technical terms. It led me to believe that at least some of these efforts are conducted “in a vacuum.” That was a surprise given the cost and strategic importance of getting these early-adopter AI projects right.

In Sales and Marketing you start with the question, “What problem are we trying to solve?” and evolve that to, “How would customers or prospects describe this problem in their own words?” Without that understanding, you can neither initially vet the solution nor quickly qualify the need for your solution when speaking with those customers or prospects. That leaves a lot of room for error when transitioning from strategy to execution.

Increased collaboration with the business side would likely have helped. This was touched on at the end of the article under “Cultural challenges,” but its importance seemed to be downplayed. Lessons learned are valuable – especially when you are able to learn from the mistakes of others. To me, this should have been called out early as a major lesson learned.

Item #2: 

This second area had to do with the perspective of the data, whether that was the angle of the subject in photographs (overhead from a drone vs horizontal from the shoreline) or the type of customer data evaluated (such as from a single source) used to train the ML algorithm.

That was interesting because it appears that assumptions may have played a part in overlooking other aspects of the problem, or that the teams may have been overly confident about obtaining the correct results using the data available. In the examples cited those teams did figure those problems out and took corrective action. A follow-on article describing the process used to make their root cause determination in each case would be very interesting.

As an aside, from my perspective, this is why Explainable AI is so important. There are times that you just don’t know what you don’t know (the unknown unknowns). Being able to understand why and on what the AI is basing its decisions should help with providing better quality curated data up-front, as well as being able to identify potential drifts in the wrong direction while it is still early enough to make corrections without impacting deadlines or deliverables.
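
As a small, hedged illustration of that point: model-agnostic checks such as permutation importance give a first-order view of what a trained model is actually leaning on, which can surface a skewed training set (like the single-source customer data above) before it becomes a production problem. The data file and column names below are hypothetical; scikit-learn and pandas are assumed.

```python
# Minimal sketch: checking which features a trained model actually relies on.
# File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv("training_data.csv")
X = pd.get_dummies(df.drop(columns=["label"]))  # one-hot encode any categoricals
y = df["label"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# If one feature (e.g., a "data_source" column) dominates, the model may be
# learning the collection process rather than the underlying problem.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```

This is obviously not a substitute for a full Explainable AI toolchain, but even a simple check like this can catch a drift in the wrong direction while it is still early enough to correct.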

Item #3: 

This didn’t surprise me, but it should be a cause for concern as advances are made at faster rates and potentially less validation is done as organizations race to be first to market with some AI-based competitive advantage. The last paragraph under ‘Training data bias’ stated that, based on a PwC survey, “only 25 percent of respondents said they would prioritize the ethical implications of an AI solution before implementing it.”

Bonus Item:

The discussion about the value of unstructured data was very interesting, especially when you consider:

  1. The potential for NLU (natural language understanding) products in conjunction with ML and AI.
  2. The importance of semantic data analysis relative to any ML effort (a small illustrative sketch follows this list).
  3. The incredible value that products like MarkLogic’s database or Franz’s AllegroGraph provide over standard Analytics Database products.
    • I personally believe that the biggest exception to this assertion will be GPU databases (like OmniSci) that easily handle streaming data, can accomplish extreme computational feats well beyond those of traditional CPU-based products, and have geospatial capabilities that provide an additional dimension of insight to the problem being solved.
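
As referenced in item 2, the sketch below uses TF-IDF vectors and cosine similarity to surface semantically related records in unstructured text. Real NLU products and semantic databases go far beyond this baseline, but even this much shows why unstructured data carries value for ML work. The example documents are made up, and scikit-learn is assumed.

```python
# Minimal sketch: finding related unstructured text via TF-IDF similarity.
# The example documents are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Customer reported intermittent login failures on the mobile app",
    "Several mobile app users cannot complete login; failures appear intermittent",
    "Quarterly revenue grew on strong enterprise sales",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)
similarity = cosine_similarity(vectors)

# The two login-related reports score far higher against each other
# than either does against the unrelated revenue note.
print(similarity.round(2))
```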

 

Update: This is a link to a related article that discusses trends in areas of implementation, important considerations, and the potential ROI of AI projects: https://www.fastcompany.com/90387050/reduce-the-hype-and-find-a-plan-how-to-adopt-an-ai-strategy

This is definitely an exciting space that will experience significant growth over the next 3-5 years. The more information, experiences, and lessons learned that are shared, the better it will be for everyone.

The Unsung Hero of Big Data


Earlier this week I was reading a blog post regarding the recent Gartner Hype Cycle for Advanced Analytics and Data Science, 2015. The Gartner chart reminded me of the epigram, “Plus ça change, plus c’est la même chose” (asserting that history repeats itself by stating the more things change, the more they stay the same.)

To some extent that is true, as you could consider today’s Big Data a derivative of yesterday’s VLDBs (very large databases) and Data Warehouses. One of the biggest changes, IMO, is the shift away from Star Schemas and practices implemented for performance reasons, such as aggregating data sets, using derived and encoded values, and using surrogate and foreign keys to establish linkage. Going forward it may not be possible to have that much rigidity and still be as responsive as needed from a competitive perspective.

There are many dimensions to big data: Huge sample of data (volume), which becomes your universal set and supports deep analysis as well as temporal and spatial analysis; A variety of data (structured and unstructured) that often does not lend itself to SQL based analytics; and often data streaming in (velocity) from multiple sources – an area that will become even more important in the era of the Internet of Things. These are the “Three V’s” that people have been talking about for the past five years.

Like many people, my interest in Object Database technology initially waned in the late 1990s. That is, until about four years ago, when a project at work led me back in this direction. As I dug into the various products I learned that they were alive and doing very well in several niche areas. That finding led to a better understanding of the real value of object databases.

Some products try to be “all Vs to all people,” but generally what works best is a complementary, integrated set of tools working together as services within a single platform. It makes a lot of sense. So, back to object databases.

One of the things I like most about my job is the business development aspect. One of the product families I’m responsible for is Versant, which includes the Versant Object Database (VOD – high performance, high throughput, high concurrency) and Fast Objects (great for embedded applications). I’ve met and worked with some brilliant people who have created amazing products based on this technology. Creative people like these are fun to work with, and helping them grow their business is mutually beneficial. Everyone wins.

An area where VOD excels is the near real-time processing of streaming data. The reason it is so well suited to this task is the way that objects map out in the database: they do so in a way that essentially mirrors reality. So, optionality is not a problem – no disjoint queries or missed data, no complex query gyrations to get the correct data set, etc. Things like sparse indexing are no problem with VOD. This means that pattern matching is quick and easy, as is more traditional rule and look-up validation. Polymorphism allows objects, functions, and even data to have more than one form.
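
To make that more concrete, here is a small, generic Python sketch of a polymorphic event model – it is illustrative only and is not the Versant Object Database API. The point is that optional fields and subclass-specific behavior live on the objects themselves, so a heterogeneous stream can be processed without disjoint queries or schema gymnastics.

```python
# Generic sketch of a polymorphic event model (illustrative only,
# not the Versant Object Database API).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    source: str
    timestamp: float

    def is_suspicious(self) -> bool:
        return False  # default: nothing suspicious

@dataclass
class LoginEvent(Event):
    user: str = ""
    failed_attempts: int = 0

    def is_suspicious(self) -> bool:
        return self.failed_attempts >= 5

@dataclass
class TransferEvent(Event):
    amount: float = 0.0
    destination: Optional[str] = None  # optionality without schema gymnastics

    def is_suspicious(self) -> bool:
        return self.amount > 10_000 or self.destination is None

# A mixed stream is handled with one polymorphic call per object.
stream = [
    LoginEvent("auth", 1.0, user="alice", failed_attempts=6),
    TransferEvent("payments", 2.0, amount=250.0, destination="acct-42"),
]
print([event for event in stream if event.is_suspicious()])
```

An object database persists object graphs like these directly; the sketch only shows why the polymorphic shape makes pattern matching over a varied stream straightforward.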


VOD does more by allowing data to be more, which is ideal for environments where change is the norm: Cyber Security, Fraud Detection, Threat Detection, Logistics, and Heuristic Load Optimization. In each case, the keys to success are performance, accuracy, and adaptability.

The ubiquity of devices generating data today, combined with the desire of people and companies to leverage that data for commercial and non-commercial benefit, is very different from what we saw 10+ years ago. Products like VOD are working their way up that Slope of Enlightenment because there is a need to connect the dots better and faster – especially as the volume and variety of those dots increase. It is not a “one size fits all” solution, but it is often the perfect tool for this type of work.

These are indeed exciting times!

Ideas are sometimes Slippery and Hard to Grasp


I started this blog with the goal of it becoming an “idea exchange,” as well as a way to pass along lessons learned to help others. Typical guidance for a blog is to focus on one thing only and do it well in order to develop a following. That is especially important if you want to monetize the blog, but that is not and has not been my goal.

One of the things that has surprised me is how different the comments and likes are for each post. Feedback from the last post was even more diverse and surprising than usual. It ranged from comments about “Siri vs Google,” to feedback about Sci-Fi books and movies, to Artificial Intelligence.

I asked a few friends for feedback and received something very insightful (Thanks Jim). He stated that he found the blog interesting, but wasn’t sure what the objective was. He went on to identify several possible goals for the last post. Strangely enough, his comments mirrored the type of feedback that I received. That pointed out an area for improvement to me, and I appreciated that, as well as the wisdom of focusing on one thing. Who knows, maybe in the future…

This also reminded me of a white paper written 12-13 years ago by someone I used to work with. It was about how Bluetooth was going to be the “next big thing.” He had read an IEEE paper or something and saw potential for this new technology. His paper provided the example of your toaster and coffee maker communicating so that your breakfast would be ready when you walk into the kitchen in the morning.

At that time I had a couple of thoughts. Who cared about something that only had a 20-30 foot range when WiFi was becoming popular and had a much greater range? In addition, a couple of years earlier I had a tour of the Microsoft “House of the Future,” in which everything was automated and key components communicated with each other. But everything in the house was hardwired or used WiFi – not Bluetooth. It was easy to dismiss his assertion because it seemed to lack pragmatism, and the value of the idea was difficult to quantify given the use case provided.


Looking back now I view that white paper as having insight (if it were visionary he would have come out with the first Bluetooth speakers, or car interface, or even phone earpiece and gotten rich), but it failed to present use cases that were easy enough to understand yet different enough from what was available at the time to demonstrate the real value of the idea. His expression of the idea was not tangible enough and therefore too slippery to be easily grasped and valued.

I’m a huge believer that good ideas sometimes originate where you least expect them. Often those ideas are incremental in nature – seemingly simple and sometimes borderline obvious, often building on some other idea or concept. An idea does not need to be unique in order to be important or valuable, but it does need to be presented in a way that makes its benefits, differentiation, and value easy to understand. That is just good communication.

One of the things I miss most from when my consulting company was active was the interaction between a couple of key people (Jason and Peter) and myself. Those guys were very good at taking an idea and helping build it out. This worked well because we had some overlapping expertise and experiences, as well as skills and perspectives that were more complementary in nature. That diversity added depth and breadth to our efforts to develop and extend those ideas by asking the tough questions early and ensuring that we could convince each other of the value.

Our discussions were creative and highly collaborative as well as a lot of fun. Each of us improved from them, and the outcome was usually something viable from a commercial perspective. As a growing and profitable small business you need to constantly innovate to differentiate yourself from your competition. Our discussions were driven as much by necessity as they were by intellectual curiosity, and I personally believe that this was part of the magic.

So, back to the last post. I view various technologies as building blocks. Some are foundational and others are complementary. To me, the key is not viewing those various technologies as competing with each other. Instead, I look for potential value created by integrating them with each other. That may not always be possible and does not always lead to something better, but occasionally it does, so to me it is a worthwhile exercise. With regard to voice technology, I do believe that we will see more, better, and smarter applications of it – especially as real-time systems become more complex due to the use of an increasing number of specialized component systems and sensors.

While today’s smartphones would not pass the Turing Test or proposed alternatives, they are an improvement over more simplistic voice translation tools available just a few years ago. Advancement requires the tools to understand context in order to make inferences. This brings you closer to machine learning, and big data (when done right) significantly increases that potential.

Ultimately, this all leads back to Artificial Intelligence (at least in my mind). It’s a big leap from a simple voice translation tool to AI, but when viewed as building blocks it is not such a stretch.

Now think about creating an interface (API) that allows one smart device to communicate with another in a manner akin to the collaborative efforts described above with my old team. It’s not simply having a front-end device exchanging keywords or queries with a back-end device. Instead, it is two or more devices and/or systems having a “discussion” about what is being requested, looking at what each component “knows,” asking clarifying questions and making suggestions, and then finally taking that multi-dimensional understanding of the problem to determine what is really needed.
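
A minimal, purely hypothetical sketch of that kind of exchange is below: two agents pass structured messages, and the responding agent can either answer or ask a clarifying question before committing to a result. Real systems would need far richer protocols, context models, and inference; the point here is only the shape of the interaction.

```python
# Hypothetical sketch of a clarify-then-answer exchange between two agents.
def assistant(request: dict) -> dict:
    """Back-end agent: asks a clarifying question when the request is ambiguous."""
    if "city" not in request:
        return {"type": "clarify", "question": "Which city do you mean?"}
    return {"type": "answer",
            "text": f"Tomorrow in {request['city']}: sunny, 22 C"}  # canned reply

def front_end(user_request: dict) -> str:
    """Front-end agent: relays requests and resolves clarifying questions."""
    reply = assistant(user_request)
    if reply["type"] == "clarify":
        # In a real system this would go back to the user or to another
        # device that holds the missing context; here we just use a default.
        user_request["city"] = "Seattle"
        reply = assistant(user_request)
    return reply["text"]

print(front_end({"intent": "weather_tomorrow"}))
```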

So, possibly not true AI, but a giant leap forward from what we have today. That would help turn the science fiction of the past into science fact in the near future. The better the understanding and inferences by the smart system, the better the results.

I also believe that an unintended consequence of these new smart systems is that the more human-like they become in their approach, the more likely they will be to make human-like errors. Hopefully they will be able to back-test recommendations to validate them and minimize errors. If they are intelligent enough to monitor results and suggest corrective actions when they determine that a recommendation is not having the desired results, that would make them even “smarter.” Best of all, there won’t be an ego creating a distortion filter on the results. Or maybe there will…

A lot of the building blocks required to create these new systems are available today. But, it takes both vision and insight to see that potential, translate ideas from slippery and abstract to tangible and purposeful, and then start building something really cool. As that happens we will see a paradigm shift in how we interact with computers and how they interact with us. That will lead us to the systematic integration that I wrote about in a big data / nanotechnology post.

So, what is the real objective of my blog? To get people thinking about things in a different way, to foster collaboration and partnerships between businesses and educational institutions in order to push the limits of technology, and to foster discussion about what others believe the future of computing and smart devices will look like. I’m confident that I will see these types of systems in my lifetime, and believe in the possibility of a lot of this occurring within the next decade.

What are your thoughts?