What can 3 world-class Michelin chefs teach us about Big Data?

Author: Jan-Kees Buenen

Heston Blumenthal. Grant Achatz. Joan Roca. Three names on the list of the most awarded chefs in the world.

They have been leaders in global gastronomy in recent years and are known for their innovative, emotional and modernist approaches. They are famous for fusing food and tech.

And what could possibly link these master chefs with big data?

Firstly, they turn common ingredients (their “raw data”) into stunning dishes (“unique new insights”) with the help of science. They have proven that combining science, technology and the human touch creates incredible results. One could even go further and say that the combination of these passionate minds and advanced technologies makes unimaginable results possible.

Look at Heston Blumenthal, the mad scientist and genius chef. He has great knowledge of his business and is completely devoted to his passion. What makes him so successful, however, is bringing science into his routine. Unique insights are generated by putting together everyday ingredients and high-tech utensils, like dehydrators and cartouches. Scientific tools enable him to leverage his creativity.

Secondly, they don’t work in a messy and dirty environment. Their staff is highly organized. The kitchen and their labs are always clean. They only work with the best ingredients. They take the utmost care of their mise-en-place so they never miss a single ingredient when arranging the plate. It is this final touch, sometimes done in the blink of an eye, that brings together all the efforts that create a high-quality experience. “It’s an immense amount of work in a very strict, almost military-like, environment”, says Grant Achatz.

In big data, we could learn a lot from these chefs. Cognitive analytics, artificial intelligence and deep learning are the scientific instruments that help create better information. However, feed them messy or poorly understood data, follow poor “data kitchen” practices, or operate the data preparation and data science stations too far from business reality, and no science in the world will show good results. When you have a “spaghetti” of models and spreadsheets with messy data, you will very likely get “garbage in, garbage out” results.

Finally, these master chefs had the courage to leave the well-trodden paths of cuisine, and they never stop learning. “I’m always pushing the creative envelope and experimenting in the kitchen. But the key to becoming a top chef is taking time to master the fundamentals”, Joan Roca asserts.

Big data also requires business leaders to master the basics, push the creative envelope and continue experimenting. Data volumes and sources – sensor, speech, images, audio, video – will continue to grow. The volume and speed of data available from digital channels will continue to outpace manual decision-making. Business leaders, like master chefs, will need to leave the path that once made sense in order to succeed and stand out in the data-driven world.

Signature dishes like Sound of the Sea, Apple Balloons and Pig Trotter Carpaccio delight so many of us. To achieve similarly cutting-edge results with data, why not take your inspiration from chefs?


Read this article on LinkedIn: https://www.linkedin.com/pulse/what-can-3-world-class-michelin-chefs-teach-us-big-data-buenen/

Why 2019 will be all about End User Computing

Author: Jan-Kees Buenen

As 2019 approaches, it’s time to look back at the IT initiatives that have shaped 2018. End user computing, that is, applications aimed at better integrating end users into the computing system environment, will surely be high on the list.

With the pace of change in data monitoring and management in 2018, the world of End User Computing (EUC) looks a lot different than it did just one year ago. Developments in AI, cloud-based technology and analytics have brought new horizons to the scene.

Some products were replaced, others simply disappeared, and several innovations emerged to fill holes many IT pros and business leaders didn’t even know they had. Many of these new technologies were launched to give businesses timely insights.

Until recently, employees often couldn’t solve a task or a problem promptly simply because they lacked the data. New technologies allow businesses and users to evolve exponentially in their daily analysis. Real-time decisions are now possible. Companies can quickly deploy solutions to adapt to changes in the most dynamic markets.

A lot of business leaders still lose sleep over some well-known challenges concerning EUC management. A quick search finds hundreds of cases in 2018 of misstated financial statements, compliance violations, cybersecurity threats and audit findings, all resulting from breakdowns in EUC control.

EUC requirements for today’s enterprises are typically complex and costly. “Employees working from multiple devices and remotely represent a tremendous IT task to manage endpoints…”, said Jed Ayres in a blog post on Forbes, “…add to that constant operating system (OS) migration demands and increasing cost pressures, and EUC might be IT’s greatest challenge”.

2018 was also the year in which leading players from different (and unexpected!) sectors tried to help companies solve the challenges around EUC management, with solutions aimed at many of the traditional end user computing issues mentioned above. Some examples:

  • KPMG released a Global Insights Pulse survey that shows a growing number of finance organizations using emerging technologies to enable next-generation finance target operating models
  • Deloitte launched an enterprise-level program for managing EUCs. The initiative provides organizations with a framework for managing and controlling EUC holistically
  • Forrester, for the first time, released an evaluation of the top 12 unified endpoint management solutions available on the market

AND WHAT WILL HAPPEN IN 2019?

Given increasing regulatory and security constraints, we can expect organizations to continue to face EUC challenges in 2019. It will be crucial for business leaders and IT pros to find solutions that manage, secure and optimize data if they want to succeed in the digital transformation value chain.

New holistic approaches and innovative technologies point to an exciting 2019, with an increasing range of disruptive EUC solutions: innovations deployed by leading players like Microsoft and Citrix as well as by highly specialized companies and start-ups.

As we wind down 2018 and look ahead to a bright new year, yes, it will be all about end user computing.

Sometimes “Too Good To Be True” Blocks Disruptive Innovation

Author: Jan-Kees Buenen

Over the last few months we have had the opportunity to present our SynerScope Ixiwa solution to many prospective corporate customers and potential business partners. I have learned a lot from these conversations, both about how to make our offering even more user-friendly and about which technical developments should be prioritized.

Disruptive technology

The most interesting insight I took from all these meetings, however, results directly from having truly disruptive technology: people generally do not believe some of the things Ixiwa can do. They think it is too good to be true. For example, people do not believe that we can tag, analyze, categorize and match the content of a data lake at record level without pre-created metadata. Others struggle to believe that we can match structured and unstructured data without predefined search logic. They question whether our patented many-to-many correlator can truly group content, elements or objects into logical clusters. Or they wonder if it is true that business users can easily add and use datasets in the lake without IT support. Frankly stated: they think we oversell the capabilities of our platform.
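To make this less abstract, here is a toy sketch of the underlying idea: grouping heterogeneous records by content similarity rather than by predefined search logic. The patented correlator itself is proprietary, so this only illustrates the general principle, using hypothetical records and an off-the-shelf TF-IDF-plus-clustering combination:

```python
# Toy sketch: grouping heterogeneous records by content similarity,
# without pre-created metadata or predefined search logic.
# (Illustrative only; records are hypothetical and the method here is
# plain TF-IDF + k-means, not SynerScope's patented correlator.)
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

records = [
    "invoice 4711 delivery of industrial pumps",   # structured export
    "email: the pump shipment arrived damaged",    # unstructured text
    "contract renewal office lease Amsterdam",
    "note: landlord confirmed lease extension",
]

# Vectorize raw content directly; no schema or tags required up front.
X = TfidfVectorizer().fit_transform(records)

# Cluster records into logical groups based purely on content.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # records about the pumps vs. records about the lease
```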

Faster, high-quality and value-generating insights

So we learned that we need to show what we can do as early as possible in the relationship: we demo on real data, in a real data lake, in a live environment, so that our partners and clients can experience first-hand how our technology works. And when we get to this point, people become enthusiastic and brainstorm about the projects they could do with our technology. Which, of course, is what we want: deploying our technology to help customers and partners get faster, higher-quality and value-generating insights out of their data.

Demo our capabilities

So please do us a favor and give us a chance by allowing us to demo our capabilities, even if you don’t immediately believe our story. We promise we won’t disappoint you.

Insurtech: Winning The Battle But Losing The War?

Author: Monique Hesseling

It was exciting to visit InsurTech Connect last week; there was great energy and optimism about what we can collectively innovate for our insurance industry and customers, and it was wonderful to see insurers, technology providers of all sizes and maturity levels, and capital providers from all around the world coming together to try to move our industry forward. As somebody eloquently pointed out: it was insurtech meeting maturetech.

Having attended many insurance and technology conferences over the years, I really saw progress around innovation this time: we have all spoken about innovation for a long time, and built plans and strategies around it, but this time I saw actual workable and implementable technical applications, and many carriers ready and able to implement, or at least run a pilot.

All of this is wonderful, of course. The one thing, however, that struck me as a possible short-term gain but a longer-term disadvantage is the current focus on individual use cases and related point solutions. Now, I do understand that it is easier to “sell” innovation in a large insurance company if it comes with a clear use case and a tight business plan, including an acceptable ROI, supported by a specific technology solution for that use case. I saw a lot of those last week, and the market seems ready to adopt and implement function-specific analytics and operational solutions around claims, underwriting, customer service, customer engagement and risk management. Each one of these comes with a clear business case and often a great technical solution, specifically created to address that business case. In the short term, this leads to a lot of enthusiasm for insurtech and carrier innovation.

However, to truly innovate and benefit from everything new technologies bring to the table, innovation will have to start taking place at a more foundational level: to access and use all available data sources, structured and unstructured, and benefit fully from insights, carriers will have to create an enterprise-wide big data infrastructure. Quickly accessing and analyzing all these new and different data types (think unstructured text, pictures or IoT) just cannot be done in a classical data environment. Now, creating a new data infrastructure is obviously a big undertaking, so I appreciate that carriers (and therefore tech firms) tend to focus first on smaller, specific, use-case-driven projects.

I fear, however, that some insurance companies will end up with a portfolio of disconnected projects and new technologies, which will quickly lead to data integration and business analytics issues and heated discussions between the business and IT. I am starting to see that happen already with some carriers I work with. So I would suggest focusing on the “data plumbing” first: get buy-in for a brave new data world. Get support and funding for a relevant big data infrastructure, in incremental steps. Start with a small lake, for innovation. Run some of these use cases on it, and quickly scale up to enterprise level. It is harder to get support and funding for “plumbing” than for a sexy use case around claims or underwriting, but it seems to be even harder to get the plumbing done when the focus is on individual business use cases. Please do not give up on the “plumbing”: we might win the innovation battle, possibly lots of battles, but lose the war.

Using Smart City Data for Catastrophe Management: It Still Is All About the Data …

Author: Monique Hesseling

Recently I had the pleasure of working with SMA’s Mark Breading on a study about the development of smart cities and what this means for the insurance industry (Breading, M., 2017. Smart Cities and Insurance: Exploring the Implications. Strategy Meets Action).

In this study, Mark touched on a number of very relevant smart-city-related technical developments that impact the insurance industry, such as driverless cars, smart buildings, improved traffic management, energy reduction, and sensor-driven, better controlled and monitored health and well-being. Mark also explored how existing risks, and the way carriers assess them, might change, and how these risks might be reduced by new technologies. However, new risks will evolve, especially around liability (think cyber, or who is to blame for errors made by technology in a driverless car or in automated traffic management). Mark concluded that insurers will have to be ready to address these changes in their products, risk assessments and risk selection.

Here in the USA, the last few weeks have shown again how much impact weather can have on our cities, lives, communities, businesses and assets. We have all seen the devastation in Texas, Florida, the Caribbean and other areas and felt the need to help, as quickly as possible. Insurance carriers and their teams, too, go out of their way (often literally) to assist their clients in these overwhelmingly trying times. And I learned in working with some of them that insurers and their clients can benefit from “smart city technologies” also in times of massive losses.

I have seen SynerScope and other technology being used to monitor wind and water levels, overlaying this data with insured risks and exposure data. Augmented with drone and satellite pictures and/or smart building and energy grid sensor data (part of which is often publicly available and open), this information gives a very quick first assessment of damage (property and some business interruption) at specific locations. I have seen satellite and drone pictures of exposures being machine-analyzed, augmented with other data and deployed in an artificial intelligence/machine learning environment that uses similarity analyses to quickly identify other insured exposures that have most likely incurred similar damage. This enables adjusters to get involved proactively in addressing a potential claim, hopefully limiting damages and getting the insured back to normal as soon as possible. Another use of this application, of course, is fraud detection.
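A simplified sketch of that similarity step, under stated assumptions: the embedding vectors and property IDs below are hypothetical, and a real pipeline would derive the embeddings from a deep image model over drone or satellite photos.

```python
# Sketch: finding insured exposures likely to share damage with a confirmed
# loss, via nearest-neighbour search over image embeddings. Vectors and IDs
# are hypothetical; a real pipeline computes embeddings with a deep model.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# One embedding per insured property, derived from its aerial photo.
exposure_embeddings = np.array([
    [0.91, 0.10, 0.33],
    [0.88, 0.15, 0.30],
    [0.05, 0.92, 0.41],
    [0.90, 0.12, 0.35],
])
property_ids = ["TX-1001", "TX-1002", "TX-1003", "TX-1004"]

nn = NearestNeighbors(n_neighbors=3).fit(exposure_embeddings)

# Embedding of a property with confirmed flood damage.
confirmed_loss = np.array([[0.89, 0.11, 0.32]])
_, idx = nn.kneighbors(confirmed_loss)

# Visually similar exposures are flagged for proactive adjuster outreach.
print([property_ids[i] for i in idx[0]])
```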

Smart City Data

Smart city projects use technology to make daily life better for citizens, businesses and government. The (big) data these projects generate, however, can also be very helpful in dealing with a catastrophe and its aftermath. We don’t always think creatively about re-using our data for new purposes. Between carriers, governments and technology providers, we should explore this more, to make our cities even smarter, also in bad times.

Download the report:

Smart Cities and Insurance


Data Lakes, Buckets and Coffee Cups

Author: Monique Hesseling

Over the last few years, primarily large carriers, and especially the more “cutting edge” ones (for all the doubters: yes, there is such a thing as a cutting-edge insurer), have invested in building data lakes. The promise was that these lakes would enable them to more easily use and analyze “big data” and gain insights that would change the way we all do business. Change our business for the better, of course: more efficient, better customer experiences, better products, lower costs. In my conversations with all kinds of carriers, I have learned that I am not the only one who struggles to fully grasp this concept:

A midsize European carrier’s CIO informed me that his company was probably too small for a whole lake, and asked me if he could start with just a “data bucket”. His assumption was that many buckets would ultimately constitute a lake. Another carrier’s CIO explained to me that she is the proud owner of a significant lake. It is just running pretty dry, since she analyzes, categorizes and normalizes all data before dumping it in. She explained that she was filling a big lake with coffee cups full of data. It would take her a long time to get that lake filled.

You might notice that these comments all deal with the plumbing of a big data infrastructure; the carriers had not yet touched on analytics and valuable insights, let alone on operationalizing insights or measurable business value. Many carriers seem to be struggling with the classical pain point of ETL, even in this new world.

By digging into this issue with big data SMEs, I learned that this ETL issue is more a matter of perception than a technological problem. Data does not have to be analyzed and normalized before being dumped into lakes, and it can still be used for analytical exercises. Hadoop companies such as Hortonworks, Cloudera or MapR, or integrated cloud solutions such as the recently announced Deloitte/SAP HANA/AWS solution, provide at least part of the solution to dive and snorkel in a lake without restricting oneself to dipping a toe in a bucket of very clean and much-analyzed data.
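A minimal sketch of this schema-on-read idea, assuming a Spark environment and a hypothetical claims folder in the lake: raw files land untouched, and structure is applied only at the moment of reading.

```python
# Minimal schema-on-read sketch with PySpark.
# The file path and field names are hypothetical illustrations.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("schema-on-read-demo").getOrCreate()

# Dump-as-is: no upfront normalization. Structure is inferred at read time.
claims = spark.read.json("hdfs:///lake/raw/claims/")  # hypothetical path

# Cleaning and typing happen at the point of use, not at intake.
recent = (claims
          .withColumn("loss_date", F.to_date("loss_date"))
          .filter(F.col("loss_date") >= "2017-01-01"))

recent.groupBy("loss_cause").count().show()
```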

And specialized firms such as SynerScope can prevent weeks, months or even longer spent filling that lake with coffee cups full of clean data, by providing capabilities to fill lakes with many different types of data fast (often within days) and at low cost. Adding their capabilities in specialized deep machine learning to these big data initiatives allows for secure, traceable and access-controlled use of “messy data” and creates quick business value.

Now, for all of us data geeks, it feels very uncomfortable to work with, or let others work with, data that has not been vetted at all. But we will have to accept that, with the influx of the massive amounts of disparate data sources carriers want to use, it will become more and more cost- and time-prohibitive to check, validate and control every piece of data at the point of intake into the lake. Isn’t it much smarter to take a close look at data at the point where we actually use it? Shifting our thinking that way, coupled with the technology available, will enable much faster value from our big data initiatives. I appreciate that this creates a huge shift in how most of us have learned to deal with data management. However, sometimes our historical truths need to be thrown overboard, and into the lake, before we can sail to a brighter future.

Dataworks Summit Munich and Dreams Coming True

Author: Monique Hesseling

Last week the SynerScope team attended the Dataworks Summit in Munich, “the industry’s premier big data community event”. It was a successful and well-attended event, and attendees were passionate about big data and its applicability to different industries. The more technical people learned (or, in the case of our CTO and CEO, demonstrated) how to get the most value quickly out of data lakes. Business folks were more interested in sessions and demonstrations on how to get actionable insights out of big data, use cases and KPIs. Most attendees came from the EMEA region, although I regularly detected American accents as well.

It has been a couple of years since I last attended a Hadoop/big data event (I believe it was 2013), and it was interesting last week to see the field maturing. Only a few years ago, solution providers and sessions focused primarily on educating attendees on the specifics of Hadoop, data lakes, definitions of big data and theoretical use cases: “wouldn’t it be nice if we could…”. Those days are gone. Already in 2015, Betsy Burton from Gartner discussed in her report “Hype Cycle for Emerging Technologies” that big data had quickly moved through the hype cycle and become a megatrend, touching many technologies and ways of automation. This became obvious at this year’s Dataworks Summit. Technical folks asked how to quickly give their business counterparts access to, and control over, big-data-driven analytics. Access control, data privacy and multi-tenancy were key topics in many conversations. Cloud versus on-premise still came up, although the consensus seemed to be that cloud is becoming unavoidable, with some companies and industries adopting it faster than others. Business people inquired about use cases and implementation successes. Many questions dealt with text analysis, although a fair number of people wanted to discuss voice analysis capabilities and options, especially for call center processes. SynerScope’s AI/machine learning case study of machine-aided tagging and identification of pictures of museum artifacts also got a lot of interest. Most business people, however, had a difficult time coming up with business cases in their own organizations that would benefit from this capability.

This leads me to an observation that was also made in some general sessions: IT and technical people tend to see Hadoop/data lake/big data initiatives as a holistic undertaking, creating opportunities for all sorts of use cases across the enterprise. Business people tend to run their innovation by narrowly defined business cases, which forces them to limit the scope to a specific use case. This makes it difficult to justify and get funding for big data initiatives beyond pilot phases. We would probably all benefit if both business and IT considered big data initiatives holistically, at the enterprise level. As was so eloquently stated in Thursday’s general session panel: “Be brave! The business needs to think bigger. Big Data addresses big issues. Find your dream projects!” I thought it was a great message, and it must be rewarding for everybody working in the field that we can start helping people with their dream projects. I know that at SynerScope we get energized by listening to our clients’ wishes and dreams and making them into realities. There is still a lot of work to be done to fully mature big data and big insights, and make dreams come true, but we have all come a long way since 2013. I am sure the next step on this journey to maturity will be equally exciting and rewarding.

Artificial Intelligence ready for “aincient” cultures?

Author: Annelieke Nagel


Google, Aincient.org, SynerScope and the Dutch National Museum of Antiquities are creating a revolutionary acceleration in antiquities research

Last Monday I was present at the launch of a fantastic initiative for Egyptian art lovers around the world! A more apt setting was hardly possible, as the presentations were organized in front of the Temple of Taffeh, an ancient Egyptian temple built by order of the Roman emperor Augustus.

Egyptologist Heleen Wilbrink, founder of Aincient.org; Andre Hoekzema, Google country manager Benelux; and Jan-Kees Buenen, CEO of SynerScope, were the presenters that afternoon.

Aincient.org is the driving force behind this pilot project. All presentations were thus geared towards explaining the need to protect world heritage by digitally capturing these art treasures and, even more importantly, by making it possible to research them and accelerate discoveries by merging all data sources. Progress in this kind of research also depends on the support of outside funds. (If you are interested, please go to www.aincient.org for further information.)

The current online collection of the Dutch National Museum of Antiquities (Rijksmuseum van Oudheden (RMO)) consists of around 57,000 items and can now be searched within hours, in a way previously not possible, thanks to SynerScope’s powerful software built on top of the Google Cloud Vision API.

The more in-depth technical explanation of the software and partnerships involved was compelling, as it linked artificial intelligence and deep learning together with artifacts and an open mind to make this project possible.

This unique pilot program needed to unlock all available data (text, graphs, photos/video, geo, numbers, audio, IoT, biomed, sensors, social) easily and very fast!

The large group of objects (60,000 in this instance, though the RMO has another 110,000 to go) from various siloed databases was categorized and brought together in SynerScope’s data visualisation software: images and texts simultaneously available, linked to a time and location indicator. The system indicates the metadata and descriptions certain items have in common, and the similarities in their appearance.
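For readers curious about the building blocks, the sketch below shows only the public Google Cloud Vision label-detection call that such a pipeline can build on; SynerScope’s clustering and visualisation layers on top of it are not shown, and the file name is a hypothetical example.

```python
# Sketch: labeling an artifact photo with the Google Cloud Vision API.
# This illustrates only the public API layer; the file path is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("artifact_0001.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Label detection returns descriptive tags with confidence scores, which can
# then be matched against curator metadata from the museum databases.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```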

As CEO Jan-Kees Buenen put it: “At SynerScope, we offer quick solutions to develop difficult-to-link data and databases, making them comprehensible and usable”.

Through Aincient.org, the RMO online collection can be linked to external databases from other museums around the world. The launch therefore generated a lot of interest from museums like Teylers Museum Haarlem, the Stedelijk Museum Amsterdam and Foundation Digital Heritage (Stichting DEN), who were all present to absorb the state-of-the-art information. Interestingly enough, some Egyptologists present expressed slight scepticism about embracing this new technology to unlock the ancient culture.

We will soon see the outcomes of this research used as a source of inspiration for new exhibition topics, and I am sure they will also progressively serve the worldwide research community.

I believe this latest technology is the future of the past!

Innovation in action: Horses, doghouses and winter time…

Author: Monique Hesseling

During a recent long flight from Europe, I read up on my insurance trade publications. And although I now know an awful lot more about blockchain, data security, cloud, big data and IoT than when I boarded in Frankfurt, I felt unsatisfied by my reading (for the frequent flyers: yes, the airline food might have had something to do with that feeling). I missed real-life case studies: examples of all this new technology in action in normal insurance processes, or integrated into down-to-earth daily insurer practices. Maybe not always very disruptive, but at least pragmatic and immediately adding value.

I know the examples I was looking for are out there, so I got together with a couple of insurance and technology friends and we had a great time identifying and discussing them. For example, the SynerScope team in the Netherlands told me that their exploratory analysis of unstructured data (handwritten notes in claims files, pictures) demonstrated that an unexplained uptick in homeowners claims was caused by events involving horses. Now think about this for a moment: in the classical way of analyzing loss causes, we start with a hypothesis and then either verify or falsify it. Honestly, even in my homeland I do not think any data analyst or actuary would hypothesize that horses were responsible for an uptick in homeowners losses. And obviously “damage caused by horse” is not a loss category in the structured claims input under homeowners coverage either. So until not too long ago, this loss cause either would not have been recognized as significant, or it would have taken analysts an enormous amount of time and a lot of luck to identify it by sifting through masses of unstructured data. The SynerScope team figured it out with one person in a couple of days. Machine-augmented learning can create very practical insights.

In our talks, we discovered these types of examples all over the world. Here in the USA, a former regional executive at a large carrier told me that she had found an uptick in house fires in the winter in the South. One would assume that people mistakenly set their houses on fire in winter with fireplaces, electrical heaters and the like to stay warm. Although that is true, a significant part of the house fires in rural areas was caused by people putting heating lamps in dog houses, to keep Fido warm. Bad idea. Again, there was no loss code for “heating lamp in doghouse” in structured claims reporting processes, nor was it a hypothesis that analysts thought to pose. So it took the trending of loss data over years before the carrier noticed this risk and took action to prevent and mitigate these dreadful losses. Exploratory analysis of unstructured claims file information in a deep machine learning environment, augmented with domain expertise and a human eye (as in the horse example I mentioned earlier), would have identified this risk much faster. We went on and on about case studies like these.
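For illustration, here is a minimal sketch of this kind of hypothesis-free exploration over free-text claim notes. The notes and stop-word list are hypothetical, and real systems add OCR for handwritten files and pair the output with visual exploration and image data:

```python
# Sketch: hypothesis-free term surfacing over free-text claim notes.
# Hypothetical data; a real pipeline also handles handwriting and images.
from collections import Counter
import re

notes = [
    "fence and stable door destroyed, neighbour's horse broke loose",
    "water damage kitchen, burst pipe",
    "horse kicked through garage wall",
    "roof tiles blown off in storm",
]

stopwords = {"and", "the", "in", "of", "a", "s", "through", "off"}
counts = Counter(
    word
    for note in notes
    for word in re.findall(r"[a-z]+", note.lower())
    if word not in stopwords
)

# Terms that rank high but map to no structured loss code, e.g. "horse",
# become leads for a human analyst to investigate.
print(counts.most_common(5))
```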

Now, although I am a great believer in and firm supporter of healthy disruption in our industry, I think we can support innovation by assisting our carriers with these kinds of very practical use cases and value propositions. We might want to focus on practical applications that can be supported by business cases, augmented with some less business-case-driven innovation and experimenting. I firmly believe that a true partnership between carriers, insurtech firms and distribution channels, and a focus on innovation around real-life use cases, will allow for fast incremental innovation and will keep everybody enthusiastic about the opportunities of the many new and exciting technologies available. While doing what we are meant to do: protecting homes, horses and human lives.

First Time Right

First time right: sending an engineer with the right qualifications to the installation address.

To ensure a continuous and reliable power supply to households in the coming years, energy providers are replacing old meters with new smart meters. The old, traditional meter is not prepared for the smart future and is not suitable for the new services and applications that help reduce energy consumption.

A wide range of meters of different ages is currently in use in houses across the country. Some of these are too old or too dangerous, which means only engineers holding special certificates can exchange them for the new smart meters. Currently, it is guesswork what type of meter will need to be replaced upon arrival at the address of installation, and so it happens too often that an engineer has to leave empty-handed, unable to carry out the planned job. The resident must then be at home to open the door twice, which is very inconvenient. The big question is: how do you send an engineer with the right qualifications first time round?

In recent years, energy companies started asking their engineers, for inventory purposes, to photograph the meters they repaired or exchanged. Over the years, pictures of meter boxes in all shapes and sizes were gathered. SynerScope is able to take these pictures and add relevant data available from open sources, such as information on the location of homes, dates of construction and pictures of neighborhoods. In this way, SynerScope creates profiles of where a certain type of meter box can be found. As not all meter boxes are documented, it is now possible, based on these profiles, to predict the type of meter in a given home that needs to be replaced. Thus the right engineer is sent first time round, leading to happy faces for both the resident and the energy company.
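To make the prediction step concrete, here is a minimal sketch under stated assumptions: the features, meter types and training rows are hypothetical stand-ins for the photo-derived labels and open location data described above.

```python
# Sketch: predicting the meter type at an address from open-data features,
# so dispatch can send an engineer with the right certificate first time.
# Feature names and training data are hypothetical illustrations.
from sklearn.ensemble import RandomForestClassifier

# Features per address: [construction_year, urban (1) / rural (0), neighbourhood_cluster_id]
X_train = [
    [1962, 1, 3],
    [1988, 1, 1],
    [1975, 0, 2],
    [2004, 1, 1],
]
# Meter types observed in engineers' photos at those addresses.
y_train = ["legacy_type_A", "standard", "legacy_type_B", "standard"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Predict the likely meter type for an undocumented address.
print(model.predict([[1968, 0, 2]]))
```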