Invitation to the September MeetUp on 22.09.2016 in Karlsruhe
Well rested and full of energy, we are kicking off the final round of the Big Data User Group! Our last Karlsruhe meeting of the year takes place on 22 September 2016. We warmly invite you to spend an interesting and informative Big Data evening with us from 18:30 at the FSSV Waldgaststätte (Adenauerring 36, Karlsruhe). The following topics will be served:
Apache Kafka 101 (Florian Troßbach, codecentric AG)
Apache Kafka is a distributed message broker that works on the publish-subscribe principle. Thanks to its particular architecture, a Kafka cluster can achieve immense throughput at very low latency while also scaling horizontally extremely well. Florian Troßbach explains the core principles of the architecture and demonstrates how clients can integrate with Kafka.
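To give a flavour of the publish-subscribe principle the talk covers, here is a minimal, purely illustrative sketch of an in-memory broker: producers append records to a named topic log, and each consumer reads from that log independently at its own offset. This is a hypothetical toy model, not the real Kafka client API.

```python
from collections import defaultdict

class Broker:
    """Toy publish-subscribe broker: one append-only log per topic."""

    def __init__(self):
        self.topics = defaultdict(list)  # topic name -> append-only record log

    def publish(self, topic, record):
        """Append a record to a topic and return its offset."""
        self.topics[topic].append(record)
        return len(self.topics[topic]) - 1

    def consume(self, topic, offset=0):
        """Read all records from a given offset; consumers track offsets themselves."""
        return self.topics[topic][offset:]

broker = Broker()
broker.publish("clicks", {"user": "a", "page": "/"})
broker.publish("clicks", {"user": "b", "page": "/docs"})

print(broker.consume("clicks"))            # every record in the topic
print(broker.consume("clicks", offset=1))  # only records after offset 0
```

The key idea mirrored here is that the broker never pushes state to consumers; each consumer keeps its own read offset, which is what lets real Kafka serve many independent consumer groups from the same log.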
About the speaker:
Florian Troßbach works as an IT consultant at codecentric AG. His roots lie in classic Java enterprise development; these days, however, his main focus is on "Fast Data" topics and the SMACK stack.
——-
Data Management for Stream Processing – and all the rest (Dr. Harald Weber-Gottschick, 1&1 Internet AG)
On the one hand there are promises: the promise of Big Data, based on analytics over a broad variety of data gathered from many different sources; and the promise of many data analytics tool vendors to deliver specialized analytics for many different use cases (each requiring its own specialized aggregation of that data). A special class of applications even promises to base automated decisions on near-real-time analytics, e.g. in the security domain.
On the other hand there is the reality: data management architectures that have grown over time often implement many different solutions along the Analytics Value Chain, built on different technologies and driven largely by whatever demand existed when a new analytics tool was introduced. Every functional domain implemented its own solution, resulting in data silos. Data management is based on file transfers, e.g. of log files, leading to long lead times before a decision can be made.
The talk presents a sketch of a consistent architecture that lets the promises come true by changing that reality. It also discusses some lessons learned from introducing the first building blocks.
About the speaker:
Dr. Harald Weber-Gottschick is Business Architect for the Mail & Media subsidiary of 1&1. As such, he is responsible for the overall logical architecture of brands such as WEB.DE, GMX and mail.com. Among other responsibilities, he is driving the uplift of the data management systems toward modern Big Data and stream processing approaches.
——-