```xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
                   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                   xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <include relativeToChangelogFile="true" file="install/all/tables.xml"/>
    <include relativeToChangelogFile="true" file="install/all/procedures.xml"/>
    ...
    <include relativeToChangelogFile="true" file="changes/all/table_add_column_xyz.xml"/>
    <include relativeToChangelogFile="true" file="changes/${projectname.branch}/adjust_procedure_asd.xml"/>
    ...
</databaseChangeLog>
```

As you can see, we use the ${projectname.branch} placeholder in the path of one of the included changelogs. The file referenced there has to be added for each of the branches, because this changelog is shared by every branch. That can be somewhat inconvenient when you only have to add a change to one of the branches, but this should not happen that often. It is more likely (at least in our case) that you have to adjust the same thing for all branches, but slightly differently, or fill some table with different data per branch.

This approach also guarantees the correct execution order of the scripts. Furthermore, we don't have to create and maintain one changelog per branch, where it could easily happen that one file is left out and the mistake goes unnoticed. In our setup, if you forget to add a file that is declared in the changelog, you will know it as soon as you execute the changelog for the specific branch. So we consider this the best method to address multiple branches.

You can also use the placeholder in other places, like the loadUpdateData tag, where you can specify a .csv file from which Liquibase will load data. There, you only need to add the changelog to the 'all' folder and the .csv files to each branch folder.

maven profiles

To configure and execute Liquibase, we use different Maven profiles. We need to specify the URL, username and password for each server, so we have one profile for each of them. The properties that depend on the environment (test, stage, prod) are defined in a config file included from the pom (as already seen above), so we also need to add a property for the environment in each profile.
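Coming back to the loadUpdateData case mentioned above, such a changeset could look roughly like this. This is a sketch, not from the original post: the table name, primary key and CSV file name are invented for illustration; the point is the ${projectname.branch} placeholder in the file path, which Maven's resource filtering replaces at build time.

```xml
<!-- Hypothetical example: one shared changeset in the 'all' folder loads
     branch-specific data, because the CSV path contains the branch placeholder. -->
<changeSet author="developer" id="load-branch-data-1" runOnChange="true">
    <loadUpdateData tableName="some_table"
                    primaryKey="id"
                    file="changes/${projectname.branch}/some_table.csv"
                    relativeToChangelogFile="true"/>
</changeSet>
```

With this, the .csv file has to exist once per branch folder, while the changeset itself stays shared.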
This way we can create a Liquibase profile for each application, per environment, per branch (yup, that makes for quite a number of profiles, but it is simply needed – you don't have to keep them in your settings.xml all the time, though, so it isn't that much of a pain once they are created 😛). By setting the username and password locally in the Maven settings.xml, we also make sure that no passwords are committed to our version control.

Example profile:

```xml
<profile>
    <id>xyz-test</id>
    <properties>
        <projectname.branch>xyz</projectname.branch>
        <projectname.environment>test</projectname.environment>
        <projectname.dbName>dbname</projectname.dbName>
        <projectname.liquibase.url>jdbc:oracle:thin:@192.168.224.234:1521:DBID</projectname.liquibase.url>
        <projectname.liquibase.username>username</projectname.liquibase.username>
        <projectname.liquibase.password>password</projectname.liquibase.password>
        <projectname.liquibase.schemaName>schema</projectname.liquibase.schemaName>
        <projectname.liquibase.changeLogFile>target/classes/path/to/changelog/db.changelog.xml</projectname.liquibase.changeLogFile>
    </properties>
</profile>
```

With this config, Liquibase uses the property file target/classes/liquibase-test.properties (keep in mind that the file initially lies in the folder src/main/resources; because we build the project before we execute Liquibase, it is then located under target/classes/, with its parameters replaced by our properties).

liquibase-test.properties:

```properties
changeLogFile=${projectname.liquibase.changeLogFile}
driver=oracle.jdbc.OracleDriver
url=${projectname.liquibase.url}
username=${projectname.liquibase.username}
password=${projectname.liquibase.password}
defaultSchemaName=${projectname.liquibase.schemaName}
verbose=true
dropFirst=false
```

Here we map our properties from the profiles to the actual Liquibase property names and also set a few other Liquibase configs.

For scripts you need to execute in a schema other than the one the database user has as its default, we also set the defaultSchemaName property of Liquibase (mainly the case when we execute scripts as the SYSDBA user).

Execution & Conclusion

Because of the use of Maven, we can execute all of the changes from our local machines very easily:

```shell
mvn clean install -Pxyz-test
```

If you connect to a remote server, you are even warned with a dialogue that shows the database name, URL and username Liquibase wants to execute the scripts on, before the scripts are actually executed. So you can check them again and abort the migration if you used the wrong profile.

With this setup we can now add scripts for only one branch, multiple branches, or all branches, without having to worry about forgetting to add a change to one branch and leaving the error unnoticed. Even if we forget to put some file into the folder of one branch, our changelog file is global for all branches! So the next time we try to execute it, Liquibase notices the missing file, informs us about it and aborts the execution. And because we don't have different folders for the environments, only for the branches, this already gets noticed on the test machines.

Please let us know what you think of our approach and if you know an even better one!

(Published 2013-04-12 at https://synyx.de/blog/liquibase-our-setup-in-a-larger-scale-project/)
Database Migration using Flyway and Spring (and existing Data)

My team and I are currently working on a project we first started in early 2010. The application has been in production since sometime late in 2010, and there has been no active development except for minor enhancements and bugfixes since then. Even if our code, processes and tools were good in 2010, we've improved a lot since then. Working on my old projects is one of the occasions where this becomes most evident.

When we start a new project today, we usually use the database migration tool Liquibase right from the beginning. The tool keeps code and the database schema in sync and usually takes care of automatic migration during application startup.

Back then we usually used SQL scripts, which had to be executed manually during deployment, to keep the database up to date.
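To make the pain concrete: deploying such a script meant running something along these lines by hand on each environment (a sketch – the script name, user and database name are invented for illustration):

```shell
# Manual "migration": pipe the release's SQL script into the target database.
# Forgetting one script, or one environment, goes unnoticed until something breaks.
mysql -u appuser -p production_db < changes/2012-09-28_add_invoice_table.sql
```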
Out of laziness or lack of time, this was also the first approach we took this week to handle database changes. These scripts are checked into version control along with any code changes.

This may work pretty well in the beginning, but it can also become annoying very fast: everything you have to do manually is destined to fail some time. ALL THE TIME! It fails on my colleagues' working machines, it fails on our continuous integration server (Jenkins), and it will probably fail hard on production if you don't pay enough attention during a deployment.

So there we were, about 60 minutes ago, with a database dump from production and a bunch of SQL scripts which had accumulated during this week of development. Well, it is Friday and I wanted to test something new, so I remembered a talk I attended earlier this year about an alternative to Liquibase: Flyway (http://code.google.com/p/flyway/).

What it basically does is execute any SQL scripts it hasn't already executed on the given database. To get started, I saved the dump of the production system into the db/migration/ package of our web application:

```shell
mkdir -p src/main/resources/db/migration/ && cp prod_dump.sql src/main/resources/db/migration/V1__initial_import.sql
```

As with many of our applications, this one too is based on Spring and Maven.
So I added the Flyway dependency to our pom.xml and also some XML to the bean configuration.

pom.xml:

```xml
<dependency>
    <groupId>com.googlecode.flyway</groupId>
    <artifactId>flyway-core</artifactId>
    <version>1.7</version>
</dependency>
```

Even if there is a Maven plugin to execute the migration scripts, we got used to migrating the database during the application boot process (because you will never have to think about it again, it simply migrates…). So we add the flyway bean to our bean configuration file. It is important that the flyway bean is instantiated early, because it has to migrate the database before anyone else uses it. In our case "anyone" is actually the EntityManager, so I configured the persistenceUnitManager to depend on flyway (which means Flyway runs first):

```xml
<!-- The class attribute of the flyway bean was lost in the original markup;
     in Flyway 1.x the central class is com.googlecode.flyway.core.Flyway. -->
<bean id="flyway" class="com.googlecode.flyway.core.Flyway" init-method="migrate">
    <property name="dataSource" ref="dataSource"/>
</bean>

<bean id="persistenceUnitManager" depends-on="flyway"
      class="org.springframework.orm.jpa.persistenceunit.DefaultPersistenceUnitManager">
    <property name="defaultDataSource" ref="dataSource"/>
</bean>
```

Of course there are several configuration options for the Flyway object. You can refer to the documentation (http://code.google.com/p/flyway/wiki/ApplicationIntegration) for further details.

By default, Flyway will now search for SQL scripts in your classpath. It expects the scripts in the db.migration package, following a particular naming scheme: Vxxx__description.sql, just like the one we already created earlier (V1__initial_import.sql).
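After a few releases, the migration folder then grows following that naming scheme; a hypothetical layout (the V2/V3 file names are invented for illustration):

```
src/main/resources/db/migration/
├── V1__initial_import.sql
├── V2__add_email_to_person.sql
└── V3__create_audit_table.sql
```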
It will also remember which version the database is currently at, and will only execute scripts it has not executed so far. So when we start our application, Flyway will find our script and execute it. Afterwards it knows that the database is at version 1 and will not execute the V1__ file again. This will only work on an empty database, so you should drop and recreate your local database at this point.

```
15:29:11,362 INFO .flyway.core.metadatatable.MetaDataTable: 111 - Creating Metadata table: schema_version (Schema: mydb)
15:29:11,408 INFO glecode.flyway.core.migration.DbMigrator: 120 - Current schema version: null
15:29:11,412 INFO glecode.flyway.core.migration.DbMigrator: 205 - Migrating to version 1
15:29:24,694 INFO glecode.flyway.core.migration.DbMigrator: 191 - Successfully applied 1 migrations (execution time 00:1.290s).
```

Now if I have database changes, I simply add a new SQL file containing the change, with the prefix V2__ and so on. When my colleagues update their working copy, they also get my SQL changes, Flyway executes them during application bootup (or integration-test), and nobody has to do this manually anymore.

Ok, nice. But what about production? When we deploy the new version of the app, we also want the scripts to be executed – but not the initial import, right? I don't want to execute "drop database dbname; create database dbname;" there. Flyway initializes itself on the first start, but only if the database it writes to is empty. So the migration would fail on production.

For this case Flyway also comes with a goal that creates the metadata table, with the ability to initialize it at any given version. You can accomplish this by code (call init() on the flyway object), via the Maven plugin (flyway:init) or on the command line.
Because I did not want to install any extra software on the production machine, I simply prepared an SQL dump of the metadata table (schema_version) right after the initial import was executed. This will now be executed against the production database right before the next deployment. Yes, manually… but for the last time ;).

(Published 2012-10-05 at https://synyx.de/blog/database-migration-using-flyway-and-spring-and-existing-data/)

Migrating data with Liquibase

Recently, we started integrating Liquibase as a database schema migration tool into most of my team's projects, for both new from-scratch projects and already existing ones. Liquibase is great because it allows us to use an SCM tool like Git to manage different revisions of an application's database schema – or more specifically, the changes required to migrate the database schema from one revision to another.

While migrating database schemas seems like a pretty straightforward task at the beginning, things get more complicated as soon as you want to roll back schema changes without dropping your database (and then rebuilding it). Liquibase also supports migrating your data across schema changes, in both directions. But let's start with the basics.

For this example, I only used the Liquibase command line interface, along with the basic MySQL command line client.
Of course, Liquibase also integrates nicely with Maven (as a Maven goal) or Spring (as a bean that executes during context initialization).

I start with a very basic table called „Person“, consisting only of an ID (primary key) and a name:

```
mysql> describe Person;
+-------+--------------+------+-----+---------+----------------+
| Field | Type         | Null | Key | Default | Extra          |
+-------+--------------+------+-----+---------+----------------+
| id    | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| name  | varchar(255) | NO   | UNI | NULL    |                |
+-------+--------------+------+-----+---------+----------------+
2 rows in set (0.00 sec)
```

Liquibase uses so-called changesets, which are XML snippets used to describe DDL statements. They are organized in change log files.
The following change set is used to create a table (via the „createTable“ tag) and two columns (via the „column“ tag):

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <changeSet author="mueller@synyx.de" id="1" runOnChange="true">
        <createTable tableName="Person">
            <column autoIncrement="true" name="id" type="BIGINT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="name" type="VARCHAR(255)">
                <constraints nullable="false"/>
            </column>
        </createTable>
    </changeSet>
</databaseChangeLog>
```

When I run Liquibase via the command line, it sets up the „Person“ table. The relevant command is „update“:

```shell
./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password="" --changeLogFile=db.changelog-0.1.0.xml update
```

Liquibase already knows how to roll back certain changesets, like the „createTable“ changeset above. If we call the command line client with „rollbackCount 1“ instead of „update“, it rolls back the last changeset it executed, and the „Person“ table is gone.

Other changesets cannot be rolled back automatically.
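For reference, the „rollbackCount 1“ invocation mirrors the „update“ command shown above, with only the command at the end swapped:

```shell
./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password="" --changeLogFile=db.changelog-0.1.0.xml rollbackCount 1
```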
Consider the following „insert“ changeset that inserts an entry into our „Person“ table:

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <changeSet author="mueller@synyx.de" id="init-1">
        <insert tableName="Person">
            <column name="name" value="John Doe"/>
        </insert>
        <rollback>
            DELETE FROM Person WHERE name LIKE 'John Doe';
        </rollback>
    </changeSet>
</databaseChangeLog>
```

I manually added a „rollback“ tag containing an SQL statement that reverses the changeset.
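The same rollback could also be expressed with a Liquibase refactoring tag instead of raw SQL – a sketch (not from the original post) using the „delete“ tag:

```xml
<rollback>
    <!-- Equivalent to the DELETE statement above, written as a refactoring tag. -->
    <delete tableName="Person">
        <where>name = 'John Doe'</where>
    </delete>
</rollback>
```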
Note that the „rollback“ tag can contain either SQL statements as text or certain Liquibase refactoring tags. Since we now have two change log XML files, I created a „master“ file that includes the other files in the order in which they should be executed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog/1.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog/1.9
                   http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-1.9.xsd">
    <include file="db.changelog-0.1.0.xml"/>
    <include file="db.changelog-0.1.0.init.xml"/>
</databaseChangeLog>
```

If we run the „update“ command with the master change log file, Liquibase checks whether the first changeset was already executed (depending on whether you rolled it back or not) and then executes the second changeset that adds a „Person“ entry. To make this work, Liquibase creates a helper table called „DATABASECHANGELOG“, containing the already-executed changesets along with a hash value (to make sure no one modifies changesets once they have been executed):

```
mysql> select id, md5sum, description from DATABASECHANGELOG;
+--------+------------------------------------+--------------+
| id     | md5sum                             | description  |
+--------+------------------------------------+--------------+
| 1      | 3:5a36f447e90b35c3802cb6fe16cb12a7 | Create Table |
| init-1 | 3:43c29e0011ebfcfd9cfbbb8450179a41 | Insert Row   |
+--------+------------------------------------+--------------+
2 rows in set (0.00 sec)
```

Now that we have the basics running, let's try something more challenging: an actual change to our schema that requires both schema and data migration.
Our „Person“ table currently has only a name column, and we decided that we want to\nsplit it up into a „firstname“ and a „lastname“ column.",[19,2208,2209],{},"Before beginning work, I have Liquibase „tag“ the database so that we can roll back to this tag later on:",[718,2211,2213],{"className":720,"code":2212,"language":722,"meta":280,"style":280},"\n./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password=\"\" --changeLogFile=changelog-master.xml \u003Cb>tag liquiblog_0_1_0\u003C/b>\n\n",[724,2214,2215,2219],{"__ignoreMap":280},[727,2216,2217],{"class":729,"line":730},[727,2218,1566],{"emptyLinePlaceholder":302},[727,2220,2221],{"class":729,"line":281},[727,2222,2223],{},"./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password=\"\" --changeLogFile=changelog-master.xml \u003Cb>tag liquiblog_0_1_0\u003C/b>\n",[19,2225,2226],{},"I created a new change set that adds the two new columns:",[718,2228,2230],{"className":1127,"code":2229,"language":1129,"meta":280,"style":280},"\n\u003Cchangeset author=\"mueller@synyx.de\" id=\"1\" runonchange=\"true\">\n \u003Caddcolumn tablename=\"Person\">\n \u003Ccolumn name=\"firstname\" type=\"VARCHAR(255)\">\n \u003Cconstraints nullable=\"false\">\n \u003C/constraints>\n \u003C/column>\n \u003Ccolumn name=\"lastname\" type=\"VARCHAR(255)\">\n \u003Cconstraints nullable=\"false\">\n \u003C/constraints>\n \u003C/column>\n \u003C/addcolumn>\n\u003C/changeset>\n\n",[724,2231,2232,2236,2241,2246,2251,2256,2261,2266,2271,2275,2279,2283,2288],{"__ignoreMap":280},[727,2233,2234],{"class":729,"line":730},[727,2235,1566],{"emptyLinePlaceholder":302},[727,2237,2238],{"class":729,"line":281},[727,2239,2240],{},"\u003Cchangeset author=\"mueller@synyx.de\" id=\"1\" runonchange=\"true\">\n",[727,2242,2243],{"class":729,"line":286},[727,2244,2245],{}," \u003Caddcolumn 
tablename=\"Person\">\n",[727,2247,2248],{"class":729,"line":1149},[727,2249,2250],{}," \u003Ccolumn name=\"firstname\" type=\"VARCHAR(255)\">\n",[727,2252,2253],{"class":729,"line":1155},[727,2254,2255],{}," \u003Cconstraints nullable=\"false\">\n",[727,2257,2258],{"class":729,"line":1161},[727,2259,2260],{}," \u003C/constraints>\n",[727,2262,2263],{"class":729,"line":1167},[727,2264,2265],{}," \u003C/column>\n",[727,2267,2268],{"class":729,"line":1173},[727,2269,2270],{}," \u003Ccolumn name=\"lastname\" type=\"VARCHAR(255)\">\n",[727,2272,2273],{"class":729,"line":1179},[727,2274,2255],{},[727,2276,2277],{"class":729,"line":1185},[727,2278,2260],{},[727,2280,2281],{"class":729,"line":1191},[727,2282,2265],{},[727,2284,2285],{"class":729,"line":1197},[727,2286,2287],{}," \u003C/addcolumn>\n",[727,2289,2290],{"class":729,"line":1203},[727,2291,2292],{},"\u003C/changeset>\n",[19,2294,2295],{},"Once again, Liquibase knows how to roll back this change set, so we can skip the rollback tag.",[19,2297,2298],{},"Now that the table has two additional columns, we must take care of migrating our existing data to the new schema before\ndeleting the old, now obsolete „name“ column. 
Since data manipulation is not supported out-of-the-box by Liquibase,\nwe have to use its „sql“ tag to include native SQL statements within a changeset.",[718,2300,2302],{"className":1127,"code":2301,"language":1129,"meta":280,"style":280},"\n\u003Cchangeset author=\"mueller@synyx.de\" id=\"2\">\n \u003Csql>\n UPDATE Person SET firstname = SUBSTRING_INDEX(name, ' ', 1);\n UPDATE Person SET lastname = SUBSTRING_INDEX(name, ' ', -1);\n \u003C/sql>\n \u003Crollback>\n UPDATE Person SET firstname = '';\n UPDATE Person SET lastname = '';\n \u003C/rollback>\n\u003C/changeset>\n\n",[724,2303,2304,2308,2313,2318,2323,2328,2333,2338,2343,2348,2353],{"__ignoreMap":280},[727,2305,2306],{"class":729,"line":730},[727,2307,1566],{"emptyLinePlaceholder":302},[727,2309,2310],{"class":729,"line":281},[727,2311,2312],{},"\u003Cchangeset author=\"mueller@synyx.de\" id=\"2\">\n",[727,2314,2315],{"class":729,"line":286},[727,2316,2317],{}," \u003Csql>\n",[727,2319,2320],{"class":729,"line":1149},[727,2321,2322],{}," UPDATE Person SET firstname = SUBSTRING_INDEX(name, ' ', 1);\n",[727,2324,2325],{"class":729,"line":1155},[727,2326,2327],{}," UPDATE Person SET lastname = SUBSTRING_INDEX(name, ' ', -1);\n",[727,2329,2330],{"class":729,"line":1161},[727,2331,2332],{}," \u003C/sql>\n",[727,2334,2335],{"class":729,"line":1167},[727,2336,2337],{}," \u003Crollback>\n",[727,2339,2340],{"class":729,"line":1173},[727,2341,2342],{}," UPDATE Person SET firstname = '';\n",[727,2344,2345],{"class":729,"line":1179},[727,2346,2347],{}," UPDATE Person SET lastname = '';\n",[727,2349,2350],{"class":729,"line":1185},[727,2351,2352],{}," \u003C/rollback>\n",[727,2354,2355],{"class":729,"line":1191},[727,2356,2292],{},[19,2358,2359],{},"Note that the content of the „rollback“-tag is kind of redundant, but the tag itself is required because Liquibase\nprevents us from rolling back changesets that cannot be rolled back implicitly and have no explicit rollback tag.",[19,2361,2362],{},"Once again, after 
executing Liquibase with the „update“-option, the new changeset is run, and our newly-created\n„firstname“ and „lastname“ columns now contain data.",[19,2364,2365],{},"Finally, I want to remove the old „name“ column.",[718,2367,2369],{"className":1127,"code":2368,"language":1129,"meta":280,"style":280},"\n\u003Cchangeset author=\"mueller@synyx.de\" id=\"3\" runonchange=\"true\">\n \u003Cdropcolumn columnname=\"name\" tablename=\"Person\">\n \u003C/dropcolumn>\n \u003Crollback>\n \u003Caddcolumn tablename=\"Person\">\n \u003Ccolumn name=\"name\" type=\"VARCHAR(255)\">\n \u003Cconstraints nullable=\"false\">\n \u003C/constraints>\n \u003C/column>\n \u003C/addcolumn>\n \u003Csql>\n UPDATE Person SET name = CONCAT(firstname, CONCAT(' ', lastname));\n \u003C/sql>\n \u003C/rollback>\n\u003C/changeset>\n\n",[724,2370,2371,2375,2380,2385,2390,2394,2399,2403,2407,2411,2415,2420,2425,2430,2435,2439],{"__ignoreMap":280},[727,2372,2373],{"class":729,"line":730},[727,2374,1566],{"emptyLinePlaceholder":302},[727,2376,2377],{"class":729,"line":281},[727,2378,2379],{},"\u003Cchangeset author=\"mueller@synyx.de\" id=\"3\" runonchange=\"true\">\n",[727,2381,2382],{"class":729,"line":286},[727,2383,2384],{}," \u003Cdropcolumn columnname=\"name\" tablename=\"Person\">\n",[727,2386,2387],{"class":729,"line":1149},[727,2388,2389],{}," \u003C/dropcolumn>\n",[727,2391,2392],{"class":729,"line":1155},[727,2393,2337],{},[727,2395,2396],{"class":729,"line":1161},[727,2397,2398],{}," \u003Caddcolumn tablename=\"Person\">\n",[727,2400,2401],{"class":729,"line":1167},[727,2402,2002],{},[727,2404,2405],{"class":729,"line":1173},[727,2406,2007],{},[727,2408,2409],{"class":729,"line":1179},[727,2410,1992],{},[727,2412,2413],{"class":729,"line":1185},[727,2414,1997],{},[727,2416,2417],{"class":729,"line":1191},[727,2418,2419],{}," \u003C/addcolumn>\n",[727,2421,2422],{"class":729,"line":1197},[727,2423,2424],{}," \u003Csql>\n",[727,2426,2427],{"class":729,"line":1203},[727,2428,2429],{}," UPDATE 
Person SET name = CONCAT(firstname, CONCAT(' ', lastname));\n",[727,2431,2432],{"class":729,"line":1209},[727,2433,2434],{}," \u003C/sql>\n",[727,2436,2437],{"class":729,"line":1215},[727,2438,2352],{},[727,2440,2441],{"class":729,"line":1221},[727,2442,2292],{},[19,2444,2445],{},"Again, the changeset itself is quite simple because Liquibase supports dropping columns, but the „rollback“-tag is more\ncomplicated: I first re-add the old „name“-column using the standard „addColumn“-tag, and then I use a custom SQL\nstatement to set the column’s value.",[19,2447,2448],{},"We end up with a new database schema, complete with data:",[718,2450,2452],{"className":720,"code":2451,"language":722,"meta":280,"style":280},"\nmysql> select * from Person;\n+----+-----------+------------+\n| id | firstname | lastname |\n+----+-----------+------------+\n| 1 | John | Doe |\n+----+-----------+------------+\n1 row in set (0.00 sec)\n\n",[724,2453,2454,2458,2463,2468,2473,2477,2482,2486],{"__ignoreMap":280},[727,2455,2456],{"class":729,"line":730},[727,2457,1566],{"emptyLinePlaceholder":302},[727,2459,2460],{"class":729,"line":281},[727,2461,2462],{},"mysql> select * from Person;\n",[727,2464,2465],{"class":729,"line":286},[727,2466,2467],{},"+----+-----------+------------+\n",[727,2469,2470],{"class":729,"line":1149},[727,2471,2472],{},"| id | firstname | lastname |\n",[727,2474,2475],{"class":729,"line":1155},[727,2476,2467],{},[727,2478,2479],{"class":729,"line":1161},[727,2480,2481],{},"| 1 | John | Doe |\n",[727,2483,2484],{"class":729,"line":1167},[727,2485,2467],{},[727,2487,2488],{"class":729,"line":1173},[727,2489,2490],{},"1 row in set (0.00 sec)\n",[19,2492,2493],{},"Because we created a tag earlier and included rollback instructions in all our changesets, we can always roll back these\nmodifications without losing any data! 
By running..",[718,2495,2497],{"className":720,"code":2496,"language":722,"meta":280,"style":280},"\n./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password=\"\" --changeLogFile=changelog-master.xml \u003Cb>rollback liquiblog_0_1_0\u003C/b>\n\n",[724,2498,2499,2503],{"__ignoreMap":280},[727,2500,2501],{"class":729,"line":730},[727,2502,1566],{"emptyLinePlaceholder":302},[727,2504,2505],{"class":729,"line":281},[727,2506,2507],{},"./liquibase --url=jdbc:mysql://localhost:3306/liquiblog --driver=com.mysql.jdbc.Driver --username=root --password=\"\" --changeLogFile=changelog-master.xml \u003Cb>rollback liquiblog_0_1_0\u003C/b>\n",[19,2509,2510],{},"..we get our original database back!",[19,2512,2513],{},"Of course, the example with splitting / concatenating strings is a little far-fetched, but the same principles can be\napplied to more sophisticated refactorings. I came across the idea for this blog post when we had to split an existing\ndomain class (mapped to a single table) into an abstract base class and two subclasses, preferably without losing data.",[1024,2515,1026],{},{"title":280,"searchDepth":281,"depth":281,"links":2517},[],[296],"2012-08-03T10:27:51","Recently, we started integrating Liquibase as a database schema migration tool into most of my team’s projects, for both\\nnew from-scratch projects and already existing ones. 
Liquibase is great because it allows us to use an SCM tool like\\nGit to manage different revisions of an application’s database schema – or more specifically, the changes required to\\nmigrate the database schema from one revision to another.","https://synyx.de/blog/migrating-data-with-liquibase/",{},"/blog/migrating-data-with-liquibase",{"title":1884,"description":1894},"blog/migrating-data-with-liquibase",[1736,1739,2527,2528],"software-development","synyx","Recently, we started integrating Liquibase as a database schema migration tool into most of my team’s projects, for both new from-scratch projects and already existing ones. Liquibase is great because…","Vinp4k8Ts7cd7z-85P0-KPtpbF-5oJqchl7xudePQQI",{"id":2532,"title":2533,"author":2534,"body":2535,"category":2754,"date":2757,"description":2758,"extension":299,"link":2759,"meta":2760,"navigation":302,"path":2761,"seo":2762,"slug":2763,"stem":2764,"tags":2765,"teaser":2770,"__hash__":2771},"blog/blog/android-2-1-sqlite-problem-with-querybuilder-and-distinct.md","Android 2.1 SQLite: problem with QueryBuilder and Distinct",[1052],{"type":12,"value":2536,"toc":2752},[2537,2540,2543,2546,2728,2735,2738,2747,2750],[15,2538,2533],{"id":2539},"android-21-sqlite-problem-with-querybuilder-and-distinct",[19,2541,2542],{},"In a recent project I encountered a problem with SQLite on android 2.1. 
On later versions, my code worked perfectly,\nbut on 2.1 it crashed every time when trying to get a column from a cursor.",[19,2544,2545],{},"Here’s the simplified code:",[718,2547,2551],{"className":2548,"code":2549,"language":2550,"meta":280,"style":280},"language-java shiki shiki-themes github-light github-dark","//member, a SQLiteOpenHelper\nBackendOpenHelper helper;\n//...\npublic List \u003CExample> getExamples(String arg){\nSQLiteQueryBuilder builder = new SQLiteQueryBuilder();\n builder.setTables(\"example e JOIN\n secondtable s ON e.id = s.example_id\");\n Map\u003CString, String> projectionMap =\n new HashMap\u003CString, String>();\n projectionMap.put(\"id\", \"e.id\");\n //... put in some more values ...\n builder.setProjectionMap(projectionMap);\n builder.setDistinct(true);\n builder.appendWhere(\" e.someRow = ? \");\n //... some more wheres ...\n SQLiteDatabase db = helper.getReadableDatabase();\n String[] selectionArgs = new String[] {\n arg\n };\n Cursor cursor = builder.query(db, null,\n null, selectionArgs, null, null, null);\n if (cursor.moveToFirst()) {\n while (cursor.isAfterLast() == false) {\n int index = cursor.getColumnIndex(\"id\");\n //on android 2.1, index is returned as -1\n //on newer versions as 1\n int id = cursor.getInt(index);\n //crashes if index is -1\n //...\n cursor.moveToNext();\n }\n }\n cursor.close();\n //...\n}\n","java",[724,2552,2553,2558,2563,2568,2573,2578,2583,2588,2593,2598,2603,2608,2613,2618,2623,2628,2633,2638,2643,2648,2653,2658,2663,2668,2673,2678,2683,2688,2693,2698,2703,2708,2713,2718,2723],{"__ignoreMap":280},[727,2554,2555],{"class":729,"line":730},[727,2556,2557],{},"//member, a SQLiteOpenHelper\n",[727,2559,2560],{"class":729,"line":281},[727,2561,2562],{},"BackendOpenHelper helper;\n",[727,2564,2565],{"class":729,"line":286},[727,2566,2567],{},"//...\n",[727,2569,2570],{"class":729,"line":1149},[727,2571,2572],{},"public List \u003CExample> getExamples(String 
arg){\n",[727,2574,2575],{"class":729,"line":1155},[727,2576,2577],{},"SQLiteQueryBuilder builder = new SQLiteQueryBuilder();\n",[727,2579,2580],{"class":729,"line":1161},[727,2581,2582],{}," builder.setTables(\"example e JOIN\n",[727,2584,2585],{"class":729,"line":1167},[727,2586,2587],{}," secondtable s ON e.id = s.example_id\");\n",[727,2589,2590],{"class":729,"line":1173},[727,2591,2592],{}," Map\u003CString, String> projectionMap =\n",[727,2594,2595],{"class":729,"line":1179},[727,2596,2597],{}," new HashMap\u003CString, String>();\n",[727,2599,2600],{"class":729,"line":1185},[727,2601,2602],{}," projectionMap.put(\"id\", \"e.id\");\n",[727,2604,2605],{"class":729,"line":1191},[727,2606,2607],{}," //... put in some more values ...\n",[727,2609,2610],{"class":729,"line":1197},[727,2611,2612],{}," builder.setProjectionMap(projectionMap);\n",[727,2614,2615],{"class":729,"line":1203},[727,2616,2617],{}," builder.setDistinct(true);\n",[727,2619,2620],{"class":729,"line":1209},[727,2621,2622],{}," builder.appendWhere(\" e.someRow = ? \");\n",[727,2624,2625],{"class":729,"line":1215},[727,2626,2627],{}," //... 
some more wheres ...\n",[727,2629,2630],{"class":729,"line":1221},[727,2631,2632],{}," SQLiteDatabase db = helper.getReadableDatabase();\n",[727,2634,2635],{"class":729,"line":1227},[727,2636,2637],{}," String[] selectionArgs = new String[] {\n",[727,2639,2640],{"class":729,"line":1233},[727,2641,2642],{}," arg\n",[727,2644,2645],{"class":729,"line":1239},[727,2646,2647],{}," };\n",[727,2649,2650],{"class":729,"line":1245},[727,2651,2652],{}," Cursor cursor = builder.query(db, null,\n",[727,2654,2655],{"class":729,"line":1251},[727,2656,2657],{}," null, selectionArgs, null, null, null);\n",[727,2659,2660],{"class":729,"line":1257},[727,2661,2662],{}," if (cursor.moveToFirst()) {\n",[727,2664,2665],{"class":729,"line":1263},[727,2666,2667],{}," while (cursor.isAfterLast() == false) {\n",[727,2669,2670],{"class":729,"line":1269},[727,2671,2672],{}," int index = cursor.getColumnIndex(\"id\");\n",[727,2674,2675],{"class":729,"line":1275},[727,2676,2677],{}," //on android 2.1, index is returned as -1\n",[727,2679,2680],{"class":729,"line":1281},[727,2681,2682],{}," //on newer versions as 1\n",[727,2684,2685],{"class":729,"line":1287},[727,2686,2687],{}," int id = cursor.getInt(index);\n",[727,2689,2690],{"class":729,"line":1293},[727,2691,2692],{}," //crashes if index is -1\n",[727,2694,2695],{"class":729,"line":1299},[727,2696,2697],{}," //...\n",[727,2699,2700],{"class":729,"line":1305},[727,2701,2702],{}," cursor.moveToNext();\n",[727,2704,2705],{"class":729,"line":1311},[727,2706,2707],{}," }\n",[727,2709,2710],{"class":729,"line":1317},[727,2711,2712],{}," }\n",[727,2714,2715],{"class":729,"line":1323},[727,2716,2717],{}," cursor.close();\n",[727,2719,2720],{"class":729,"line":1329},[727,2721,2722],{}," //...\n",[727,2724,2725],{"class":729,"line":1335},[727,2726,2727],{},"}\n",[19,2729,2730,2731,2734],{},"After some research I found out that this apparently happens when using ",[124,2732,2733],{},"distinct"," with the QueryBuilder on 
android\n2.1.",[19,2736,2737],{},"So a quick fix for this problem is simply not to use the getColumnIndex() method of the cursor, but instead to\naccess the column by its index (though you have to remember to change this part of the code if you make changes to your table\ncolumns).",[718,2739,2741],{"className":2548,"code":2740,"language":2550,"meta":280,"style":280}," int id = cursor.getInt(1);\n",[724,2742,2743],{"__ignoreMap":280},[727,2744,2745],{"class":729,"line":730},[727,2746,2740],{},[19,2748,2749],{},"I hope this will help someone who encounters the same problem, so they don’t have to search for a solution as long as I\nhad to.",[1024,2751,1026],{},{"title":280,"searchDepth":281,"depth":281,"links":2753},[],[2755,2756],"mobile-blog","tutorial","2012-05-22T09:39:08","In a recent project I encountered a problem with SQLite on android 2.1. On later versions, my code worked perfectly,\\nbut on 2.1 it crashed every time when trying to get a column from a cursor.","https://synyx.de/blog/android-2-1-sqlite-problem-with-querybuilder-and-distinct/",{},"/blog/android-2-1-sqlite-problem-with-querybuilder-and-distinct",{"title":2533,"description":2542},"android-2-1-sqlite-problem-with-querybuilder-and-distinct","blog/android-2-1-sqlite-problem-with-querybuilder-and-distinct",[2766,2767,1736,2768,2769],"2-1","android","mobile","sqlite","In a recent project I encountered a problem with SQLite on android 2.1. 
On later versions, my code worked perfectly, but on 2.1 it crashed every time when trying to…","x1Vxa8-B7BB8vb0rMrFKnHNmHw0JMSkiTZ3HMChf8mU",{"id":2773,"title":2774,"author":2775,"body":2776,"category":2834,"date":2836,"description":2837,"extension":299,"link":2838,"meta":2839,"navigation":302,"path":2840,"seo":2841,"slug":2780,"stem":2843,"tags":2844,"teaser":2849,"__hash__":2850},"blog/blog/synyx-messagesource-load-your-i18n-messages-from-database.md","Synyx MessageSource: Load your i18n messages from database",[1746],{"type":12,"value":2777,"toc":2832},[2778,2781,2790,2797,2812],[15,2779,2774],{"id":2780},"synyx-messagesource-load-your-i18n-messages-from-database",[19,2782,2783,2784,2789],{},"A while ago we wanted to store internationalisation for a project in a database to allow (a subset of) users to create and\nupdate internationalisation using the application itself. When we searched the web for existing projects that allow doing\nthis, we did not find a good, ready-to-use solution. This is why we decided to write the code ourselves and make it\navailable to others, especially since there was some public demand and it will probably not be shipped by SpringSource (\ncheck out ",[23,2785,2788],{"href":2786,"rel":2787},"http://www.google.de/search?q=spring+i18n+database",[27],"Google"," for details).",[19,2791,2792,2793,2796],{},"So today I’d like to announce our new Open Source project ",[85,2794,2795],{},"Synyx Messagesource for Spring",", business-friendly and\npublished under the Apache License, Version 2.0.",[19,2798,2799,2800,2803,2804,2807,2808,2811],{},"When you want to store internationalisation of your Spring-backed application in a database, this project is the right\nthing to use. It provides an implementation of Spring’s ",[724,2801,2802],{},"MessageSource"," interface that is able to load and cache a set of\nmessages at once using a ",[724,2805,2806],{},"MessageProvider",". 
The project ships with a configurable one that is able to read (and write) your\ni18n to a database using JDBC. There is also support for importing and exporting your messages to the “standard” i18n\n",[724,2809,2810],{},".properties"," files.",[19,2813,2814,2815,2820,2821,2826,2827,135],{},"You can find the project’s homepage, including documentation on how to get started and how everything\nworks, ",[23,2816,2819],{"href":2817,"rel":2818},"http://messagesource.synyx.org",[27],"here",". You should have no problems getting up and running after reading the\ninformation in the ",[23,2822,2825],{"href":2823,"rel":2824},"https://github.com/synyx/messagesource/wiki",[27],"project’s Wiki",". If you have any trouble or\na feature request, feel free to contact us\nor ",[23,2828,2831],{"href":2829,"rel":2830},"https://github.com/synyx/messagesource/issues",[27],"create a ticket in the project’s issue tracker",{"title":280,"searchDepth":281,"depth":281,"links":2833},[],[296,2835],"open-source-blog","2011-02-14T18:06:55","A while ago we wanted to store internationalisation for a project in database to allow (a subset of) users to create and\\nupdate internationalisation using the application itself. When we searched the web for existing projects that allow to\\ndo this we did not find a good and ready to use solution. This is why we decided to write the code ourselves and make it\\navailable to others, especially since there was some public demand and it will probably not be shipped by SpringSource (\\ncheck out Google for details).","https://synyx.de/blog/synyx-messagesource-load-your-i18n-messages-from-database/",{},"/blog/synyx-messagesource-load-your-i18n-messages-from-database",{"title":2774,"description":2842},"A while ago we wanted to store internationalisation for a project in database to allow (a subset of) users to create and\nupdate internationalisation using the application itself. 
When we searched the web for existing projects that allow to\ndo this we did not find a good and ready to use solution. This is why we decided to write the code ourselves and make it\navailable to others, especially since there was some public demand and it will probably not be shipped by SpringSource (\ncheck out Google for details).","blog/synyx-messagesource-load-your-i18n-messages-from-database",[1736,2845,2846,2847,2848,1879],"i18n","internationalisation","internationalization","open-source","A while ago we wanted to store internationalisation for a project in database to allow (a subset of) users to create and update internationalisation using the application itself. When we…","nyZcMPq1I7-esYkZoVwAgntWZKJrIJkCdwQvpagdS20",[2852,2855,2858,2861,2864,2867,2870,2873,2876,2879,2882,2885,2888,2891,2894,2897,2900,2903,2906,2909,2912,2915,2917,2920,2923,2926,2929,2931,2934,2937,2940,2943,2946,2949,2952,2955,2958,2961,2964,2967,2970,2973,2976,2979,2982,2985,2988,2991,2994,2997,3000,3003,3006,3009,3012,3014,3017,3020,3023,3025,3028,3031,3034,3036,3039,3042,3045,3048,3051,3054,3057,3060,3063,3066,3069,3072,3075,3078,3081,3083,3086,3089,3092,3095,3098,3101,3104,3107,3110,3112,3115,3118,3121,3124,3127,3130,3133,3136,3139,3141,3144,3147,3150,3153,3155,3158,3161,3164,3167,3170,3173,3176,3179,3182,3185,3188,3191,3194,3197,3200,3203,3206,3209,3212,3215,3218,3221,3224,3227,3230,3231,3234,3237,3240,3243,3246,3249,3252,3255,3258,3261,3264],{"slug":2853,"name":2854},"abel","Jennifer Abel",{"slug":2856,"name":2857},"allmendinger","Otto Allmendinger",{"slug":2859,"name":2860},"antony","Ben Antony",{"slug":2862,"name":2863},"arrasz","Joachim Arrasz",{"slug":2865,"name":2866},"bauer","David Bauer",{"slug":2868,"name":2869},"bechtold","Janine Bechtold",{"slug":2871,"name":2872},"boersig","Jasmin Börsig",{"slug":2874,"name":2875},"buch","Fabian Buch",{"slug":2877,"name":2878},"buchloh","Aljona Buchloh",{"slug":2880,"name":2881},"burgard","Julia 
Burgard",{"slug":2883,"name":2884},"caspar-schwedes","Caspar Schwedes",{"slug":2886,"name":2887},"christina-schmitt","Christina Schmitt",{"slug":2889,"name":2890},"clausen","Michael Clausen",{"slug":2892,"name":2893},"contargo_poetzsch","Thomas Pötzsch",{"slug":2895,"name":2896},"damrath","Sebastian Damrath",{"slug":2898,"name":2899},"daniel","Markus Daniel",{"slug":2901,"name":2902},"dasch","Julia Dasch",{"slug":2904,"name":2905},"denman","Joffrey Denman",{"slug":2907,"name":2908},"dfuchs","Daniel Fuchs",{"slug":2910,"name":2911},"dobler","Max Dobler",{"slug":2913,"name":2914},"dobriakov","Vladimir Dobriakov",{"slug":2916,"name":2916},"dreiqbik",{"slug":2918,"name":2919},"dschaefer","Denise Schäfer",{"slug":2921,"name":2922},"dschneider","Dominik Schneider",{"slug":2924,"name":2925},"duerlich","Isabell Duerlich",{"slug":2927,"name":2928},"dutkowski","Bernd Dutkowski",{"slug":2930,"name":2930},"eifler",{"slug":2932,"name":2933},"essig","Tim Essig",{"slug":2935,"name":2936},"ferstl","Maximilian Ferstl",{"slug":2938,"name":2939},"fey","Prisca Fey",{"slug":2941,"name":2942},"frank","Leonard Frank",{"slug":2944,"name":2945},"franke","Arnold Franke",{"slug":2947,"name":2948},"frischer","Nicolette Rudmann",{"slug":2950,"name":2951},"fuchs","Petra Fuchs",{"slug":2953,"name":2954},"gari","Sarah Gari",{"slug":2956,"name":2957},"gast","Gast",{"slug":2959,"name":2960},"graf","Johannes Graf",{"slug":2962,"name":2963},"grammlich","Daniela Grammlich",{"slug":2965,"name":2966},"guthardt","Sabrina Guthardt",{"slug":2968,"name":2969},"haeussler","Johannes Häussler",{"slug":2971,"name":2972},"hammann","Daniel Hammann",{"slug":2974,"name":2975},"heetel","Julian Heetel",{"slug":2977,"name":2978},"heft","Florian Heft",{"slug":2980,"name":2981},"heib","Sebastian Heib",{"slug":2983,"name":2984},"heisler","Ida Heisler",{"slug":2986,"name":2987},"helm","Patrick Helm",{"slug":2989,"name":2990},"herbold","Michael Herbold",{"slug":2992,"name":2993},"hofmann","Peter 
Hofmann",{"slug":2995,"name":2996},"hopf","Florian Hopf",{"slug":2998,"name":2999},"jaud","Alina Jaud",{"slug":3001,"name":3002},"jayasinghe","Robin De Silva Jayasinghe",{"slug":3004,"name":3005},"jbuch","Jonathan Buch",{"slug":3007,"name":3008},"junghanss","Gitta Junghanß",{"slug":3010,"name":3011},"kadyietska","Khrystyna Kadyietska",{"slug":1746,"name":3013},"Marc Kannegiesser",{"slug":3015,"name":3016},"karoly","Robert Károly",{"slug":3018,"name":3019},"karrasz","Katja Arrasz-Schepanski",{"slug":3021,"name":3022},"kaufmann","Florian Kaufmann",{"slug":10,"name":3024},"Mike Kesler",{"slug":3026,"name":3027},"kirchgaessner","Bettina Kirchgäßner",{"slug":3029,"name":3030},"klem","Yannic Klem",{"slug":3032,"name":3033},"klenk","Timo Klenk",{"slug":1052,"name":3035},"Tobias Knell",{"slug":3037,"name":3038},"knoll","Anna-Lena Knoll",{"slug":3040,"name":3041},"knorre","Matthias Knorre",{"slug":3043,"name":3044},"koenig","Melanie König",{"slug":3046,"name":3047},"kraft","Thomas Kraft",{"slug":3049,"name":3050},"krupicka","Florian Krupicka",{"slug":3052,"name":3053},"kuehn","Christian Kühn",{"slug":3055,"name":3056},"lange","Christian Lange",{"slug":3058,"name":3059},"larrasz","Luca Arrasz",{"slug":3061,"name":3062},"leist","Sascha Leist",{"slug":3064,"name":3065},"lihs","Michael Lihs",{"slug":3067,"name":3068},"linsin","David Linsin",{"slug":3070,"name":3071},"maniyar","Christian Maniyar",{"slug":3073,"name":3074},"martin","Björnie",{"slug":3076,"name":3077},"martin-koch","Martin Koch",{"slug":3079,"name":3080},"matt","Tobias Matt",{"slug":9,"name":3082},"Christian Mennerich",{"slug":3084,"name":3085},"menz","Alexander Menz",{"slug":3087,"name":3088},"meseck","Frederick Meseck",{"slug":3090,"name":3091},"messner","Oliver Messner",{"slug":3093,"name":3094},"michael-ploed","Michael Plöd",{"slug":3096,"name":3097},"mies","Marius Mies",{"slug":3099,"name":3100},"mihai","Alina Mihai",{"slug":3102,"name":3103},"moeller","Jörg Möller",{"slug":3105,"name":3106},"mohr","Rebecca 
Mohr",{"slug":3108,"name":3109},"moretti","David Moretti",{"slug":1886,"name":3111},"Sven Müller",{"slug":3113,"name":3114},"muessig","Alexander Müssig",{"slug":3116,"name":3117},"neupokoev","Grigory Neupokoev",{"slug":3119,"name":3120},"nussbaecher","Carmen Nussbächer",{"slug":3122,"name":3123},"ochs","Pascal Ochs",{"slug":3125,"name":3126},"oelhoff","Jan Oelhoff",{"slug":3128,"name":3129},"oengel","Yasin Öngel",{"slug":3131,"name":3132},"oezsoy","Enis Özsoy",{"slug":3134,"name":3135},"posch","Maya Posch",{"slug":3137,"name":3138},"ralfmueller","Ralf Müller",{"slug":3140,"name":3140},"redakteur",{"slug":3142,"name":3143},"reich","Michael Reich",{"slug":3145,"name":3146},"reinhard","Karl-Ludwig Reinhard",{"slug":3148,"name":3149},"rmueller","Rebecca Müller",{"slug":3151,"name":3152},"rosum","Jan Rosum",{"slug":3154,"name":3154},"rueckert",{"slug":3156,"name":3157},"ruessel","Sascha Rüssel",{"slug":3159,"name":3160},"sauter","Moritz Sauter",{"slug":3162,"name":3163},"schaefer","Julian Schäfer",{"slug":3165,"name":3166},"scherer","Petra Scherer",{"slug":3168,"name":3169},"schlicht","Anne Schlicht",{"slug":3171,"name":3172},"schmidt","Jürgen Schmidt",{"slug":3174,"name":3175},"schneider","Tobias Schneider",{"slug":3177,"name":3178},"seber","Benjamin Seber",{"slug":3180,"name":3181},"sommer","Marc Sommer",{"slug":3183,"name":3184},"speaker-fels","Jakob Fels",{"slug":3186,"name":3187},"speaker-gierke","Oliver Gierke",{"slug":3189,"name":3190},"speaker-krupa","Malte Krupa",{"slug":3192,"name":3193},"speaker-mader","Jochen Mader",{"slug":3195,"name":3196},"speaker-meusel","Tim Meusel",{"slug":3198,"name":3199},"speaker-milke","Oliver Milke",{"slug":3201,"name":3202},"speaker-paluch","Mark Paluch",{"slug":3204,"name":3205},"speaker-schad","Jörg Schad",{"slug":3207,"name":3208},"speaker-schalanda","Jochen Schalanda",{"slug":3210,"name":3211},"speaker-schauder","Jens Schauder",{"slug":3213,"name":3214},"speaker-unterstein","Johannes 
Unterstein",{"slug":3216,"name":3217},"speaker-wolff","Eberhard Wolff",{"slug":3219,"name":3220},"speaker-zoerner","Stefan Zörner",{"slug":3222,"name":3223},"stefan-belger","Stefan Belger",{"slug":3225,"name":3226},"steinegger","Roland Steinegger",{"slug":3228,"name":3229},"stern","sternchen synyx",{"slug":2528,"name":2528},{"slug":3232,"name":3233},"szulc","Mateusz Szulc",{"slug":3235,"name":3236},"tamara","Tamara Tunczinger",{"slug":3238,"name":3239},"theuer","Tobias Theuer",{"slug":3241,"name":3242},"thieme","Sandra Thieme",{"slug":3244,"name":3245},"thies-clasen","Marudor",{"slug":3247,"name":3248},"toernstroem","Olle Törnström",{"slug":3250,"name":3251},"ullinger","Max Ullinger",{"slug":3253,"name":3254},"ulrich","Stephan Ulrich",{"slug":3256,"name":3257},"wagner","Stefan Wagner",{"slug":3259,"name":3260},"weigel","Andreas Weigel",{"slug":3262,"name":3263},"werner","Fabian Werner",{"slug":3265,"name":3266},"wolke","Sören Wolke",["Reactive",3268],{"$scookieConsent":3269,"$ssite-config":3271},{"functional":3270,"analytics":3270},false,{"_priority":3272,"env":3276,"name":3277,"url":3278},{"name":3273,"env":3274,"url":3275},-10,-15,0,"production","nuxt-app","https://synyx.de",["Set"],["ShallowReactive",3281],{"category-database":-1,"authors":-1},"/blog/tags/database"]