February 03–05, 2020

PgConf.Russia 2020

PGConf.Russia is Russia's leading international PostgreSQL conference, annually bringing together more than 700 PostgreSQL professionals from Russia and other countries: core and software developers, DBAs and IT managers. The three-day program includes training workshops presented by leading PostgreSQL experts, more than 40 talks, panel discussions and a lightning talk session.

Themes

  • PostgreSQL at the cutting edge of technology: big data, internet of things, blockchain
  • New features in PostgreSQL and around: PostgreSQL ecosystem development
  • PostgreSQL in business software applications: system architecture, migration issues and operating experience
  • Integration of PostgreSQL with 1C, GIS and other application software systems
  • 62 talks
  • offline format

Talks

Talks archive

PgConf.Russia 2020
  • Oleg Pravdin, Lingualeo

    A brief story of how a MySQL → PG migration increased company efficiency tenfold:

    1. Program code was reduced 50-fold, and the backend team was optimized from 15 to 3 engineers
    2. Development of new features is now measured in days rather than months
    3. Infrastructure costs per 1M users were cut 20-fold
    4. The database structure and technical documentation were simplified significantly, from 100K highly interdependent tables to just 20 simple tables
    5. A new level of security, because arbitrary external SQL commands to the database are completely forbidden (see the sketch after this list)
    6. Quick analytics aggregation across multiple parameters, without external analytics systems
    7. Last but not least: the main business was kept alive during the migration
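
    Not from the talk itself: a minimal, hypothetical SQL sketch of the access model behind point 5, in which the application role cannot query tables directly and may only call approved server-side functions. All role, table, column and function names below are invented for illustration.

        -- Hypothetical sketch; the "users" table and all names are invented.
        CREATE ROLE app_client LOGIN;

        -- The application role gets no direct access to tables.
        REVOKE ALL ON ALL TABLES IN SCHEMA public FROM app_client;

        -- Data is exposed only through functions owned by a trusted role.
        CREATE FUNCTION get_user_profile(p_user_id bigint)
        RETURNS TABLE (user_id bigint, name text, xp_level int)
        LANGUAGE sql SECURITY DEFINER AS $$
            SELECT u.user_id, u.name, u.xp_level
            FROM users u
            WHERE u.user_id = p_user_id;
        $$;

        GRANT EXECUTE ON FUNCTION get_user_profile(bigint) TO app_client;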

  • Premnath J, CSG Systems International Pvt Ltd.
    Abhinav M, CSG Systems International Pvt Ltd.

    Many businesses that rely on database management systems such as Oracle, DB2 and MS SQL find them increasingly costly and unreliable: maintenance and product license fees keep growing, and the technologies in the current environment become obsolete or no longer serve the business purpose. As competitors move to the newer technologies and tools available on the market, migrating to an efficient, consistent and reliable environment becomes necessary to stay competitive. PostgreSQL has emerged as a top open-source RDBMS, and since it carries no licensing cost, many companies are planning to migrate databases currently running on other RDBMSs such as Oracle, DB2 and MS SQL Server to PostgreSQL.

    This report summarizes the methodologies, procedures and techniques involved in successfully migrating data from Oracle and DB2 to PostgreSQL. Migration is not a simple effort: it requires proper planning and testing, from database connectivity through performance analysis. We cover most of the steps to consider before and after the migration, such as choosing the right tools, the time required to migrate, data compatibility, code conversion, application connectivity to the database, database configuration parameters, performance analysis, replication setups, database monitoring, patching and backup strategies.
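
    As an illustration of the code-conversion step mentioned above (not taken from the talk; open-source tools such as ora2pg automate much of this), here are a few common Oracle idioms and their PostgreSQL equivalents, with invented table and sequence names:

        -- Oracle (shown as comments):
        --   SELECT user_id, NVL(last_login, SYSDATE) FROM users;
        --   SELECT seq_orders.NEXTVAL FROM dual;

        -- PostgreSQL equivalents:
        SELECT user_id, COALESCE(last_login, now()) FROM users;
        SELECT nextval('seq_orders');   -- no DUAL table is needed

        -- Typical type mapping when recreating a table
        -- (Oracle NUMBER -> numeric, VARCHAR2 -> text, DATE -> timestamp):
        CREATE TABLE orders (
            order_id   bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
            customer   text          NOT NULL,
            amount     numeric(12,2),
            created_at timestamptz   DEFAULT now()
        );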

  • Nikolay Averin, Miro

    pg_repack is one of the most popular tools for removing bloat from tables and indexes in Postgres. In most cases it works perfectly. But if you use a Postgres feature such as deferred constraints, using pg_repack becomes more difficult or even impossible. I will talk about how we encountered this problem and describe some workarounds, from Postgres' built-in facilities to a small patch for pg_repack.
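
    For context (not part of the abstract): deferred constraints postpone their checks until transaction commit; this is the Postgres feature the abstract refers to. A minimal sketch with invented table names:

        -- A deferrable foreign key; the check runs at COMMIT, not per statement.
        CREATE TABLE parents (
            id bigint PRIMARY KEY
        );

        CREATE TABLE children (
            id        bigint PRIMARY KEY,
            parent_id bigint REFERENCES parents (id)
                      DEFERRABLE INITIALLY DEFERRED
        );

        BEGIN;
        -- The child row may be inserted before its parent exists;
        -- the foreign key is only verified at COMMIT.
        INSERT INTO children VALUES (1, 100);
        INSERT INTO parents  VALUES (100);
        COMMIT;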

  • Egor Rogov, PostgresPro

    To build a decent query plan, the optimizer has to understand the statistical characteristics of the underlying data. It is interesting to observe how the structure of the collected information has become more complicated over time: what the optimizer relied on back in its early days, and what is at its disposal now with the release of version 12. We will also talk about how and when statistics are collected, how to manage this process, and whether it is necessary to think about it at all.
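
    The abstract itself contains no examples; as a hedged illustration of the kind of statistics management it refers to (table, column and statistics object names are invented), one might:

        -- Collect or refresh statistics for a table.
        ANALYZE orders;

        -- Inspect what the planner knows about a column.
        SELECT null_frac, n_distinct, most_common_vals
        FROM   pg_stats
        WHERE  tablename = 'orders' AND attname = 'status';

        -- Increase the sample detail for a skewed column.
        ALTER TABLE orders ALTER COLUMN status SET STATISTICS 500;

        -- Extended statistics (PostgreSQL 10+) for correlated columns.
        CREATE STATISTICS orders_city_zip (dependencies) ON city, zip FROM orders;
        ANALYZE orders;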

All talks

Partners

PgConf.Russia 2020

Organizational

Informational

Technical

Partner