Andreas Scherbaum, Pivotal
10:30, 17 March
22 min

Introduction to Greenplum MPP Database

An overview of the architecture of the Greenplum MPP (Massively Parallel Processing) database: how GPDB works internally, how to configure and set it up, and how to distribute data effectively for MPP workloads.
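
As a rough illustration of the distribution choices the abstract mentions, here is a minimal sketch using Python and psycopg2 (Greenplum speaks the PostgreSQL wire protocol). The database, table, and column names are hypothetical, and it assumes a reachable Greenplum coordinator; it is not material from the talk itself.

```python
# Minimal sketch of the two basic Greenplum distribution choices.
# Assumes a reachable Greenplum coordinator; all names are made up.
import psycopg2

conn = psycopg2.connect("dbname=warehouse")  # regular PostgreSQL driver works

with conn, conn.cursor() as cur:
    # Hash-distribute on a high-cardinality key so rows spread evenly across
    # segments and joins on customer_id can stay segment-local.
    cur.execute("""
        CREATE TABLE orders (
            order_id    bigint,
            customer_id bigint,
            amount      numeric
        ) DISTRIBUTED BY (customer_id)
    """)
    # With no natural key, round-robin distribution avoids data skew,
    # at the cost of redistributing rows for most joins.
    cur.execute("""
        CREATE TABLE clickstream_raw (
            ts      timestamptz,
            payload text
        ) DISTRIBUTED RANDOMLY
    """)
```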

Other talks

  • Maksim Viharev, Alytics
    45 min

    Using PostgreSQL at Alytics to automate contextual advertising with near real-time processing of a mixed OLTP/OLAP load

    In the data persistence layer we have used PostgreSQL from the very start of development, going all the way from a small cluster on a virtual machine to a multi-host system that provides near real-time processing of a mixed OLTP/OLAP load. In this talk, I'm going to cover the main development stages of our analytical solution at the application and infrastructure levels and describe the specifics of using PG that we encountered along the way.

  • Ivan Panchenko, Postgres Professional
    90 min

    JSON, JSONB, JSQuery

    This tutorial covers various applied JSON usage patterns and the related PostgreSQL functionality. We will discuss storing data in the JSON format; retrieving, changing, and searching such data; JSON features for simple SQL queries; and using JSON in stored procedures in different languages. You'll get hands-on experience with some of the discussed problems on the provided virtual machines. (A short JSONB sketch appears after the talk list below.)

  • Radoslav Glinsky, Skype (Microsoft)
    45 min

    Test environment on demand

    Do you test your PostgreSQL releases in a dedicated test environment before they reach Production? Are you sure that your test environment ("Test" for short) matches Production and is in an appropriate state?

    At Skype we faced multiple challenges associated with database testing:
    - Simplifying the complex Production architecture (thousands of PostgreSQL instances interconnected by RPCs and replication, infrastructure servers, and external DB scripts) into its Test counterparts.
    - Constantly growing hardware requirements and insufficient cleanup of data generated in Test.
    - Differences between Test and Production kept appearing and accumulating, and recognizing and fixing them required a lot of effort.

  • Philip Delgyado, ООО «Лектон»
    22 min

    Distributed workflow specifics in PostgreSQL

    When working with complex business logic, you often have to implement a workflow: a sequence of several processing steps, each implementing a separate part of the business logic. This is usually done with specialized queues, but when reliability requirements are high, it makes sense to do everything in PostgreSQL.

    I will describe the tasks that require a workflow implementation, offer a solution, compare it with other options, and tell you about the implementation traps and pitfalls.
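
As a rough illustration of the queue-on-PostgreSQL approach the last talk refers to (not the speaker's actual design), here is a minimal sketch of one workflow step: a worker claims a pending task with FOR UPDATE SKIP LOCKED so several workers can run in parallel without blocking each other, and the claim, the processing, and the status update happen in one transaction. The table, columns, and step names are hypothetical; it assumes psycopg2 and PostgreSQL 9.5 or later.

```python
# Hedged sketch of a workflow step queue in plain PostgreSQL; table and step
# names are hypothetical. Assumes psycopg2 and PostgreSQL 9.5+ (SKIP LOCKED).
import psycopg2

conn = psycopg2.connect("dbname=workflow")  # assumes such a database exists

def setup() -> None:
    """Create the (hypothetical) task queue table."""
    with conn, conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS task_queue (
                id         bigserial PRIMARY KEY,
                step       text        NOT NULL,
                payload    jsonb       NOT NULL,
                status     text        NOT NULL DEFAULT 'pending',
                created_at timestamptz NOT NULL DEFAULT now()
            )
        """)

def claim_and_finish_one(step: str) -> bool:
    """Claim one pending task for `step`, process it, and mark it done.

    The whole claim/process/complete cycle runs in a single transaction, so a
    crashed worker releases its row lock and the task stays 'pending'.
    """
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT id, payload
              FROM task_queue
             WHERE status = 'pending' AND step = %s
             ORDER BY id
             FOR UPDATE SKIP LOCKED
             LIMIT 1
            """,
            (step,),
        )
        row = cur.fetchone()
        if row is None:
            return False                      # nothing to do right now
        task_id, payload = row
        # ... run the business-logic handler for this step on `payload` ...
        cur.execute("UPDATE task_queue SET status = 'done' WHERE id = %s", (task_id,))
    return True
```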
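
For the JSON, JSONB, JSQuery tutorial listed above, the following is a small sketch of the kind of functionality it covers: storing documents in a jsonb column, indexing them with GIN, and querying with the containment operator. It is not the tutorial's own material; the database, table, and sample data are hypothetical, and it assumes psycopg2 and PostgreSQL 9.5 or later.

```python
# Hedged sketch of basic JSONB usage; all names and data are made up.
# Assumes a reachable PostgreSQL 9.5+ database and psycopg2.
import psycopg2

conn = psycopg2.connect("dbname=demo")

with conn, conn.cursor() as cur:
    cur.execute("CREATE TABLE IF NOT EXISTS docs (id serial PRIMARY KEY, data jsonb)")
    cur.execute(
        "INSERT INTO docs (data) VALUES (%s::jsonb)",
        ('{"name": "Alice", "tags": ["postgres", "json"]}',),
    )
    # A GIN index speeds up containment (@>) and key-existence (?) queries.
    cur.execute("CREATE INDEX IF NOT EXISTS docs_data_gin ON docs USING gin (data)")
    # ->> extracts a field as text; @> tests whether the document contains
    # the given JSON fragment.
    cur.execute(
        "SELECT data->>'name' FROM docs WHERE data @> %s::jsonb",
        ('{"tags": ["postgres"]}',),
    )
    print(cur.fetchall())  # e.g. [('Alice',)]
```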