Recently I was helping one of my children research a topic for a school paper. She was doing well, but the results she was getting were overly broad. So, I taught her some “Google-Fu,” explaining how you can structure queries in ways that yield better results. She commented that the searches should be smarter than that, and I explained that sometimes the problem is that search engines look at your past searches and customize results as an attempt to appear smarter. Unfortunately, those results can be skewed and potentially lead someone in the wrong direction. It was a good reminder that getting the best results from search engines often requires a bit of skill and query planning.
Then the other day I saw a commercial from Motel 6 (“Gas Station Trouble”) in which a man has trouble getting good results from his smartphone. That reminded me of watching someone speak to his phone, growing frustrated with the responses he received. His question went something like this: “Siri, I want to take my wife to dinner tonight, someplace that is not too far away, and not too late. And she likes to have a view while eating, so please look for something with a nice view. Oh, and we don’t want Italian food because we just had that last night.” Just as remarkable as the question itself was watching him ask it over and over again in exactly the same way, becoming more frustrated each time. I asked myself, “Are smartphones making us dumber?” Instead of dwelling on that question, I began to think about what future smart interfaces could be like.
I grew up watching Sci-Fi computer interfaces: “Computer” on Star Trek (1966), “HAL” in 2001: A Space Odyssey (1968), “KITT” from Knight Rider (1982), and “Samantha” from Her (2013). These interfaces had a few things in common: they responded to verbal commands; they were interactive, not just providing answers but also asking qualifying questions and allowing for interruptions to drill down or refine the search (e.g., with pictures or questions that resembled verbal Venn diagrams); and they often suggested alternate queries based on intuition. Despite 50 years of science fiction examples, we are still a long way from realizing that goal. Like many new technologies, these interfaces were envisioned by science fiction writers long before they appeared in real products.
There seems to be a spectrum of common beliefs about modern interfaces. On one end are products that make visualization easy, facilitating understanding, refinement, and drill-down of data sets; Tableau is a great example of this type of easy-to-use interface. At the other end of the spectrum the emphasis is on back-end systems: robust computer systems that digest huge volumes of data and return the results of complex queries within seconds. The Actian Analytics Platform is a great example of a powerful analytics platform. In reality, you need both if you want to maximize the full potential of either.
But there is so much more to be done. I predict that within the next 3–5 years we will see business and consumer examples that are closer to the verbal interfaces from those familiar Sci-Fi shows (albeit with limited capabilities and no flashing lights). Within the next 10 years I believe we will have computer interfaces that intuit our needs and help generate the correct answers quickly and easily. While this is unlikely to reach the level of “the world’s first intelligent Operating System” envisioned in the movie “Her,” and probably won’t even be able to read lips like “HAL,” it should be much more like HAL and KITT than like Siri (from Apple) or Cortana (from Microsoft). Siri was groundbreaking consumer technology when it was introduced, and Cortana seems to have taken a small leap ahead. Google Now is somewhat of a latecomer to this consumer smart interface party and, in my opinion, is behind both Siri and Cortana.
So, what will this future smart interface do? It will need to be very powerful, harnessing a natural language interface on the front-end with an extremely flexible and robust analytics interface on the back-end. The language interface will need to take an ordinary question (in multiple languages and dialects), just as if you were asking a person; deconstruct it using Natural Language Processing (NLP); and develop the proper query based on the available data. That is important, but it only gets you so far.
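To make the idea concrete, here is a toy sketch (not a real NLP system) of how the dinner request from the anecdote above might be deconstructed into a structured query. All field names (`intent`, `exclude_cuisines`, `constraints`) and the keyword rules are hypothetical; a real NLP front-end would use far more sophisticated parsing.

```python
import re

def parse_request(utterance: str) -> dict:
    """Extract a few constraints from a restaurant request using naive rules."""
    query = {"intent": "find_restaurant", "exclude_cuisines": [], "constraints": []}
    text = utterance.lower()
    if "not too far" in text:
        query["constraints"].append("max_distance")
    if "not too late" in text:
        query["constraints"].append("open_early_evening")
    if "view" in text:
        query["constraints"].append("has_view")
    # "we don't want Italian food" -> exclude that cuisine
    m = re.search(r"don.t want (\w+) food", text)
    if m:
        query["exclude_cuisines"].append(m.group(1))
    return query

request = ("Siri, I want to take my wife to dinner tonight, someplace that is "
           "not too far away, and not too late. And she likes to have a view "
           "while eating, so please look for something with a nice view. Oh, "
           "and we don't want Italian food because we just had that last night.")
print(parse_request(request))
```

Even this crude rule-based version shows the shape of the problem: turning free-form speech into a query with filters and exclusions that a back-end analytics engine can actually execute.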
Data will come from many sources – the things we handle today with relational, object, and graph databases. There will be structured and unstructured data that must be joined and filtered quickly and accurately. In addition, context will be more important than ever. Pictures and videos could be scanned for faces and location (via geotagging), and videos could be analyzed for speech as well. Relationships will be identified and inferred from a variety of sources, using both data and metadata. Sensors will collect data from almost everything we do and (someday) wear, providing both content and context. Stylometry will identify outside content likely related to the people involved in the query, providing further context about interests, activities, and even biases. This is how future interfaces will truly understand (not just interpret), intuit (so they can determine what you really want to know), and then present results that may be far more accurate than we are used to today. Because the interface is interactive, it will make it quick and easy to organize and analyze subsets of the data.
So, where do I think that this technology will originate? I believe that it will be adapted from video game technology. Video games have consistently pushed the envelope over the years, helping drive the need for higher bandwidth I/O capabilities in devices and networks, better and faster graphics capabilities, and larger and faster storage (which ultimately led to flash memory and even Hadoop). Animation has become very lifelike and games are becoming more responsive to audio commands. It is not a stretch of the imagination to believe that this is where the next generation of smart interfaces will be found (instead of from the evolution of current smart interfaces).
Someday it may no longer be possible to “tweak” results through the use or omission of keywords, quotation marks, and flags. Additionally, it may no longer be necessary to understand special query languages (SQL, NoSQL, SPARQL, etc.) and syntax. We won’t have to worry as much about incorrect joins, spurious correlations and biased result sets. Instead, we will be given the answers we need – even if we don’t realize that this was what we needed in the first place. At that point computer systems may appear nearly omniscient.
When this happens, parents will no longer need to teach their children “Google-Fu.” Those are going to be interesting times indeed.
In my last post I discussed the importance of proper pricing for profitability and success. As most people know, you increase profitability by increasing revenue and/or decreasing costs. But, cost reduction doesn’t have to mean slashing headcount, wages, benefits, or other factors that could negatively affect morale and ultimately quality and customer satisfaction. There is often a better way.
The best businesses generally focus on repeatability, realizing that the more you do something – anything – the better you should get at doing it. You develop a compelling selling story based on past successes, build a solid reference base, and identify the sweet spot from a pricing perspective. People keep buying what you are selling, and if your pricing is right there is money available at the end of the month to fund organic growth and operational efficiency efforts.
Finding ways to increase operational efficiency is the ideal way to reduce costs, but it does take time and effort. Sometimes efficiency comes from increased experience and skill, but optimization often comes through standardization and automation: developing a system that works well, consistently applying it, measuring and analyzing the results, and then making changes to improve the process. An added benefit is that this approach increases quality as well, making your offering even more attractive.
Metrics should be collected at the “work package” level or lower (e.g., the task level). A work package is the lowest-level group of related tasks that produces a discrete deliverable. This is a project management concept, and it works whether you are manufacturing something, building something, or creating something. Collecting metrics at this level allows you to accurately create and validate cost and time estimates. And when you analyze work at this level of detail, it becomes easier to identify ways to simplify or automate the process.
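A minimal sketch of what work-package-level measurement can look like: compare estimated versus actual hours per package and flag any package whose variance falls outside a target band. The sample data and the -4%/+5% band are illustrative, not prescriptive.

```python
# Hypothetical work packages with estimated vs. actual effort in hours.
work_packages = [
    {"name": "requirements",  "estimated_hours": 40, "actual_hours": 41},
    {"name": "schema design", "estimated_hours": 24, "actual_hours": 23},
    {"name": "report module", "estimated_hours": 60, "actual_hours": 70},
]

def variance_pct(wp):
    """Percent variance from estimate; positive means the work ran over."""
    return (wp["actual_hours"] - wp["estimated_hours"]) / wp["estimated_hours"] * 100

for wp in work_packages:
    v = variance_pct(wp)
    # Flag packages outside an illustrative -4% / +5% accuracy band.
    status = "OK" if -4.0 <= v <= 5.0 else "REVIEW"
    print(f'{wp["name"]:15s} {v:+6.1f}%  {status}')
```

Flagged packages become candidates for process improvement, and the running history of variances tells you whether your estimating model is actually getting better over time.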
When I had my company we leveraged this approach to win more business with competitive fixed-price project bids that provided healthy profit margins for us while minimizing risk for our clients. Larger profit margins allowed us to fund ongoing employee training and education, innovation efforts, and international expansion, and to experiment with new things (products, technology, methodology, etc.) that were fun and often taught us something valuable. It was only possible because of our focus on doing everything as efficiently and effectively as possible, learning from everything we did, good and bad, and having a tangible way to measure and prove that we were constantly improving.
Think like a CEO, act like a COO, and measure like a CFO. Do this and make a real difference in your own business!
William Thomson, Lord Kelvin, was a pretty smart guy in the 1800s. He didn’t get everything right (e.g., he supposedly stated, “X-rays will prove to be a hoax.”), but his success ratio was far better than most, so he did have useful insight. I’m personally a fan of his quote, “If you can not measure it, you can not improve it.”
Business Intelligence (BI) systems can be very powerful, but only when they are embraced as a catalyst for change. What you often find in practice is that the systems are not actively used, do not track the “right” metrics (i.e., those that provide insight into something important that you can adjust to impact results), or provide the right information only after it is too late to make a difference.
The goal of any business is developing a profitable business model and then executing extremely well. So, you need to have something that people want, then need to be able to deliver high quality goods and/or services, and finally need to make sure that you can do that profitably (it’s amazing how many businesses fail to understand this last part). Developing a systematic approach that allows for repeatable success is important. Pricing at a level that is competitive and provides a healthy profit margin provides the means for growth and sustainability.
Every business is systemic in nature. Outputs from one area (such as a steady flow of qualified leads from Marketing) become inputs to another (Sales). Closed deals feed project teams, development teams, support teams, etc. Great jobs by those teams will generate referrals, expansion, and other growth – and the cycle continues. This is an important concept to understand because problems or deficiencies in one area can manifest themselves in other areas.
Next, an understanding of cause and effect is important. For example, if your website is not getting traffic, is it because of poor search engine optimization, or is it bad messaging and/or presentation? If people come to your website but don’t stay long, do you know what they are doing? Some formatting is better for printing than for reading on a screen (such as multi-column pages), so people tend to print and go. And external links that do not open in a new window can hurt the “stickiness” of a website. Cause and effect is not always as simple as it would seem, but having data on as many areas as possible will help you understand which ones are really important.
When I had my company we gathered metrics on everything. We even had “efficiency factors” for every Consultant. That helped with estimating, pricing, and scheduling. We would break work down into repeatable components for estimating purposes. Over time we found that our estimates ranged between 4% under and 5% over the actual time required for nearly every work package within a project. This allowed us to fix bid projects to create confidence, and price at a level that was lean (we usually came-in about the middle of the pack from a price perspective, but the difference was that we could guarantee delivery for that price). More importantly, it allowed us to maintain a healthy profit margin that let us hire the best people, treat them well, invest in our business, and take some profit as well.
There are many standard metrics for all aspects of a business. Getting started can be as simple as creating some sample data based on estimates, “working the model” with that data, and seeing if this provides additional insight into business processes. Then ask, “When and where could I have made a change to positively impact the results?” Keep working and when you have something that seems to work gather some real data and re-work the model. You don’t need fancy dashboards (yet).
Within a few days it is often possible to identify the Key Performance Indicators (KPIs) that are most relevant for your business. Then start consistently gathering data, systematically analyzing it, and presenting it in a way that is easy to understand and drill into in a timely manner. To measure the right things is truly to know.
In an earlier post I mentioned that one of the big benefits of geospatial technology is its ability to show connections between complex and often disparate data sets. As you work with Big Data you tend to see the value of these multi-layered and often multi-dimensional perspectives of a trend or event. While that can lead to incredible results, it can also lead to spurious correlations of data.
First, let me state that I am not a Data Scientist or Statistician, and there are definitely people far more expert on this topic than myself. But, if you are like the majority of companies out there experimenting with geospatial and big data it is likely that your company doesn’t have these experts on-staff. So, a little awareness, understanding, and caution can go a long way in this type of scenario.
Before we dig into that, let’s think about your goal. Do you want to identify and understand a particular trend (reinforcing actions and/or behavior), or do you want to understand what triggers a specific event (initiating a specific behavior)? Both are important, but they are different. My personal focus has been on identifying trends so that you can leverage or exploit them for commercial gain. While that may sound a bit ominous, it is really what business is all about.
There is a common saying: “Correlation does not imply causation.” A classic example is that at a large fire you may see a large number of fire trucks. There is a correlation, but it does not imply that fire trucks cause fires. Now, extending this analogy, let’s assume that in a major city the probability of a multi-tenant building catching fire is relatively high. Since it is a big city, it is also likely that most of those apartments or condos have WiFi hotspots. A spurious correlation would be to conclude that WiFi hotspots cause fires.
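The WiFi-and-fires example can be sketched in a few lines of code. Here a hidden confounder (neighborhood density) drives both series, so they correlate strongly even though neither causes the other. All the numbers are invented for illustration.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Neighborhood density drives both series; neither causes the other.
density  = [10, 25, 40, 55, 70, 85]        # residential units per block (made up)
hotspots = [d * 0.9 + 3 for d in density]  # WiFi hotspots track density
fires    = [1, 3, 2, 4, 5, 5]              # fires also roughly track density

print(round(pearson(hotspots, fires), 3))  # strong correlation, no causation
```

The correlation between hotspots and fires is high, yet the honest explanation is the shared driver. Checking candidate confounders like this, before acting on a correlation, is exactly the kind of simple analysis a non-specialist can do.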
As you can see, there is definitely potential to misunderstand the results of correlated data. More logical analysis would lead you to see the relationships between the type of building (multi-tenant residential housing) and technology (WiFi) or income (middle-class or higher). Taking the next step to understand the findings, rather than accepting them at face value, is very important.
Once you have what looks to be an interesting correlation there are many fun and interesting things you can do to validate, refine, or refute your hypothesis. It is likely that even without high-caliber data experts and specialists you will be able to identify correlations and trends that can provide you and your company with a competitive advantage. Don’t let the potential complexity become an excuse for not getting started, because as you can see above it is possible to gain insight and create value with a little effort and simple analysis.
Two years ago I was assigned some of the product management and product marketing work for a new version of a database product we were releasing. To me this was the trifecta of bad fortune. I didn’t mind product marketing but knew it took a lot of work to do it well. I didn’t feel that product management was a real challenge (I was so wrong here), and I really didn’t want to have anything to do with maps.
Boy, was I wrong in so many ways. I didn’t realize that real product management was just as much work as product marketing. And, I learned that spatial was far more than just maps. It was quite an eye-opening experience for me; one that turned out to be very valuable as well.
First, let me start by saying that I now have a huge appreciation for Cartography. I never realized how complex mapmaking really is, and how there is just as much art as there is science (a lot like programming). Maps can be so much more than just simple drawings.
I had a great teacher when it came to geospatial – Tyler Mitchell (@spatialguru). He showed me the power of overlaying tabular business data with common spatial data (addresses, zip / postal codes, coordinates) and presenting the “conglomeration of data” in layers that made things easier to understand. People buy easy, so that is good in my book.
The more I thought about this technology – simple points, lines, and areas combined with powerful functions – the more I began to think about other uses. I realized that you could use it to correlate very different data sets and graphically show relationships that would otherwise be extremely difficult to see.
Think about having access to population data, demographic data, business and housing data, crime data, health/disease data, etc. Now think about a simple, easy-to-use graphical dashboard that lets you overlay as many of those data sets as you want. Within seconds you see very specific clusters of data that are correlated geographically. Some data may only be granular to a zip code or city, but other data will let you identify patterns down to specific streets and neighborhoods. Just think of how something so simple can help you make much better decisions. The interesting thing is how few businesses are really taking advantage of this cost-effective technology.
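The layering idea above can be sketched without any GIS software at all: join two tabular data sets on a shared spatial key (here, zip code) and look for places where the layers overlap in an interesting way. The data sets, zip codes, and thresholds are all invented for illustration.

```python
# Two hypothetical "layers" keyed by zip code.
crime_by_zip   = {"10001": 120, "10002": 45, "10003": 210, "10004": 15}
clinics_by_zip = {"10001": 2,   "10002": 6,  "10003": 1,   "10004": 5}

# Overlay: zip codes with high crime AND low clinic coverage might be
# candidates for new services or further investigation.
candidates = [
    z for z in crime_by_zip
    if crime_by_zip[z] > 100 and clinics_by_zip.get(z, 0) < 3
]
print(sorted(candidates))  # ['10001', '10003']
```

A real geospatial platform does this with points, lines, and polygons instead of zip-code strings, and renders the overlap on a map, but the underlying idea is the same: a spatial key lets otherwise disparate data sets be joined and compared.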
If that weren’t enough, just think about location-aware applications and the proliferation of smart devices, which lend themselves to so many helpful and lucrative mobile applications. Even more than that, such applications make those devices more helpful and user friendly. Just think about how easy it is to find the nearest Indian restaurant when the thought of curry for lunch hits you. And these things are just the tip of the iceberg.
What a lucky day it was for me when I was assigned this work that I did not want. Little did I know that it would change the way that I think about so many things. That’s just the way things work out sometimes.