
Solving Cybercrime at Scale and in Realtime

In a recent event organized by Hortonworks, SynerScope and Inter Visual Systems, we discussed using data technologies to solve cybercrime at scale and in real time.


Information security is a big problem today. With more attacks happening all the time, and increasingly sophisticated attacks beyond the script kiddies of yesterday, patrolling the borders of our networks and controlling threats from both outside and within is becoming harder. We can no longer rely on endpoint protection for a few thousand PCs and servers; as connected cars, the Internet of Things, and mobile devices become more common, the attack surface broadens. To face these problems, we need technologies that go beyond the traditional SIEM, which relies on human operators writing rules. We need to use the power of the Hadoop ecosystem to find new patterns, machine learning to uncover subtle signals, and big data tools to help human analysts work better and faster to meet these new threats. Apache Metron is a platform on top of Hadoop that meets these needs. Here we will look at the platform in action, how to use it to trace a complex real-world threat, and how it compares to traditional approaches. Come and see how to make your SOC more effective with automated evidence gathering, Hadoop-powered integration, and real-time detection.
Speaker:
Simon Elliston Ball, Director Product Management, Cyber Security, Hortonworks



Advantages of a Central Security Data Lake

Cyber security teams are keen not only on finding threats, but also on understanding them. By moving all relevant data out of siloed individual systems and into a central security data lake, SynerScope greatly enhances the productivity of the Security Operations Center. The SOC is provided with operationally relevant information on events as they happen, as well as the ability to hunt for and discover unknown risks within the enterprise. SynerScope Ixiwa is used to orchestrate and correlate the data, and SynerScope Iximeer is used for human-in-the-loop viewing, understanding and collaboration. This combination greatly speeds up attaching new sources, reduces time to resolution, and enhances the way findings are shared within the SOC.

Speaker:
Jorik Blaas, CTO, SynerScope



Secure data transmission in control room environments

Data is a major asset of any organization. Not only for commercial companies, but also for government institutions and other types of organizations, the vast amount of images, video, and data needs to be distributed throughout the organization in a fast and easy way. Control rooms are typically the central intelligence hubs for all information. However, the actual needs of the control room are not limited to the personnel within this room. It is the nerve center for communicating and collaborating with everybody involved. Stakeholders, wherever they are located, expect complete and swift communication about any possible issue and real-time status overviews. The vision of Inter Visual Systems is to offer a solution that distributes data throughout the complete organization to the right location in a fast, easy and secure way. It is even possible to share information between different secured private networks.

Speaker:
Harry Witlox, Project Manager, Inter Visual Systems

In cooperation with:

Webinar SynerScope with Hortonworks

Do Insurers Spend Too Much Time Understanding Data vs. Finding Value In It?

Recorded Tuesday, April 25, 2017  |  57 minutes

Every insurance company, regardless of line of business, is focused on being more data-centric. Risk assessment based on data is at the heart of analysis. Understanding and paying valid claims quickly is key to customer retention and loyalty. Creating new insurance offerings to meet market and customer demands is imperative to remain relevant.

Today insurance companies have more data available to them than ever before, whether it is big data from open data sets, IoT data, customer behavior data, photos from risk assessments, drone aerial inspections of property, or traditional risk/claim/customer data. It's all data! The challenge remains: how quickly can you ask questions of the data and make insightful business decisions?

During this webinar you will learn how to:

  • Spend little or no time on data hygiene and data transformation
  • Make data accessible across the enterprise through data usage and collaboration
  • Quickly identify what new open data or existing data is most valuable for your risk assessments
  • Leverage deep learning on the underwriting and claims processes for a positive impact to your combined ratio
Speakers:
  • Cindy Maike, GM of Insurance Solutions at Hortonworks
  • Monique Hesseling, Strategic Business Advisor at SynerScope
  • Pieter Stolk, VP of Customer Engagement at SynerScope

Data Lakes, Buckets and Coffee Cups

Author: Monique Hesseling

Over the last few years, primarily large carriers, and especially the more "cutting edge" ones (for all the doubters: yes, there is such a thing as a cutting-edge insurer), have invested in building data lakes. The promise was that these lakes would enable them to more easily use and analyze "big data", and gain insights that would change the way we all do business. Change our business for the better, of course: more efficient, better customer experiences, better products, lower costs. In my conversations with all kinds of carriers, I have learned that I am not the only one who struggles to totally grasp this concept:

A midsize carrier’s CIO in Europe informed me that his company was probably too small for a whole lake, and asked me if he could start with just a “data bucket”. His assumption was that many buckets would ultimately constitute a lake. Another carrier’s CIO explained to me that she is the proud owner of a significant lake. It is just running pretty dry, since she analyzes, categorizes and normalizes all data before dumping it in. She explained that she was filling a big lake with coffee cups full of data. It would take her a long time to get that lake filled.

You might notice that these comments all dealt with the plumbing of a big data infrastructure; the carriers had not yet touched on analytics and valuable insights, let alone on operationalizing insights or measurable business value. Many carriers seem to be struggling with the classical pain point of ETL, even in this new world.

By digging into this issue with big data SMEs, I learned that this ETL issue is more a matter of perception than a technological problem. Data does not have to be analyzed and normalized before being dumped into a lake, and it can still be used for analytical exercises. Hadoop companies such as Hortonworks, Cloudera or MapR, or integrated cloud solutions such as the recently announced Deloitte/SAP HANA/AWS solution, provide at least part of the solution to dive and snorkel in a lake without restricting oneself to dipping a toe in a bucket of very clean and much-analyzed data.
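The idea that data can land in the lake raw and still be analyzed is usually called schema-on-read. As a minimal illustrative sketch (hypothetical data and field names, not any vendor's actual implementation), the contrast with clean-before-load is easy to show in Python: messy records are stored as-is, and a schema is applied only at the point of use.

```python
import json

# Raw, messy records are dumped into the "lake" as-is; a schema-on-write
# approach would force us to clean and normalize them before loading.
raw_lake = [
    '{"claim_id": 1, "amount": "1200.50", "region": "EU"}',
    '{"claim_id": 2, "amount": 980, "notes": "hail damage"}',  # region missing
    '{"amount": "75", "claim_id": 3, "region": "US"}',         # field order varies
]

def read_with_schema(raw_records):
    """Apply a schema only at query time (schema-on-read)."""
    for line in raw_records:
        rec = json.loads(line)
        yield {
            "claim_id": int(rec["claim_id"]),
            "amount": float(rec["amount"]),          # coerce mixed str/int types
            "region": rec.get("region", "UNKNOWN"),  # tolerate missing fields
        }

claims = list(read_with_schema(raw_lake))
print(sum(c["amount"] for c in claims))  # 2255.5
```

The point of the sketch is that the cleanup cost is paid once, in the reader, at the moment the data is actually needed, rather than for every record at intake.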

And specialized firms such as SynerScope can prevent weeks, months or even longer of filling that lake with coffee cups full of clean data by providing capabilities to fill lakes with many different types of data fast (often within days) and at a low cost. Adding their capabilities in specialized deep machine learning to these big data initiatives allows for secure, traceable and access controlled use of “messy data” and creates quick business value.

Now, for all of us data geeks, it feels very uncomfortable to work with, or enable others to work with, data that has not been vetted at all. But we will have to accept that, with the influx of the massive amounts of disparate data sources carriers want to use, it will become more and more cost- and time-prohibitive to check, validate and control every piece of data at the point of intake into the lake. Isn't it much smarter to take a close look at data at the point where we actually use it? Shifting our thinking that way, coupled with the technology available, will enable much faster value from our big data initiatives. I appreciate that this creates a huge shift in how most of us have learned to deal with data management. However, sometimes our historical truths need to be thrown overboard and into the lake before we can sail to a brighter future.

SynerScope announces Gold ISV partner status for Hortonworks

January 16, 2017, The Netherlands – SynerScope, the Big Data Analytics innovator, today announced its Gold ISV status for the Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF). Hortonworks, a leading innovator of open and connected data platforms, has certified SynerScope's solution and expertise for the insurance industry. As a member of the Hortonworks Partnerworks community, SynerScope is able to develop, test, certify, deploy and support joint solutions, as well as gain access to technical and marketing resources.

Jan-Kees Buenen, CEO of SynerScope, said: "Insurance companies often need to act really quickly due to suddenly changing circumstances. This requires fast decision-making based on current data. Customers are keen to find and fully understand data patterns to enable insight and action that directly impact business objectives. We believe our partnership with Hortonworks will give the insurance industry instruments for real-time, ultra-fast decision-making by domain experts for continuous improvement."

The insurance industry is undergoing a radical transformation, and regulators are demanding greater focus on solvency. The Hortonworks-certified SynerScope solution enables customers to reduce costs and risks, improve efficiency, generate new revenue, drive growth and comply with regulations. SynerScope has linked its advanced high-speed visualization technology to HDP. This combination provides machine learning and advanced analysis, but also helps avoid creating any specific database lock-in. NoSQL, search and in-memory SQL databases complete the technology stack, allowing insurance carriers and other enterprises to become data-driven in strategic analysis and operations.

Cindy Maike, general manager for insurance at Hortonworks, commented: "Our customers want insights delivered at the speed of business. SynerScope accelerates customer success, reducing implementation times with its insurance industry knowledge and expertise."


IBM & Hortonworks collaboration

Imagine the excitement here at SynerScope when Hortonworks and IBM announced their collaboration to offer an open source distribution on Power Systems. We are working with both the Hortonworks Data Platform (HDP) and the POWER8, with its 2 × 12-core 3 GHz CPUs, 2 × NVIDIA K80 GPUs and 1 TB of memory(!). Besides the obvious excitement from a technical point of view (high geek alert), this new partnership will enable us to better serve our customers.

For enterprise users running POWER8-based systems, the first microprocessor designed for big data and analytics, Hortonworks provides a new, cost-effective distribution option for running their big data and analytics workloads. This open source Hadoop and Spark distribution will complement the performance of Power Systems by allowing clients to quickly gain business insights from their structured and unstructured data. Adding SynerScope pushes the efficiency and impact of this new combination further, getting from data to insights even faster. With SynerScope, customers get an all-in-one solution: flexible, user-friendly, visual and at scale. It is not only about finding patterns, but about understanding them by bringing all data sources together in one single visual environment.

We will be the first to work with the new Hortonworks-POWER8 combination, and we will keep you posted about solution launches with Hortonworks HDP and POWER8.


SynerScope is now a certified Hortonworks partner

We are thrilled to announce the certification of SynerScope on the Hortonworks Data Platform! As of now, Hadoop-based data can be accessed directly from SynerScope.