Is there a possibility to execute an SQL script after EclipseLink has generated the DDL?
In other words, is it possible to use the EclipseLink property "eclipselink.ddl-generation" with "drop-and-create-tables" and have EclipseLink execute another SQL file (to insert some data into the tables just created) after creating the table definitions? I'm using EclipseLink 2.x and JPA 2.0 with GlassFish v3.
Or can I init the tables within a Java method which is called on deployment of the project (a WAR with EJB 3)?
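For illustration, the kind of "init on deployment" approach I have in mind would be something like the following sketch, assuming an EJB 3.1 @Singleton @Startup bean and a container-managed DataSource (the JNDI name, table, and SQL are placeholders):

import javax.annotation.PostConstruct;
import javax.annotation.Resource;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.Statement;

@Singleton
@Startup
public class DataInitializer {

    // Placeholder JNDI name: use the data source of your persistence unit
    @Resource(lookup = "jdbc/myPool")
    private DataSource dataSource;

    @PostConstruct
    void seed() {
        // Runs once when the application is deployed; note that the ordering
        // relative to EclipseLink's DDL generation is not guaranteed by this alone.
        try (Connection con = dataSource.getConnection();
             Statement st = con.createStatement()) {
            st.executeUpdate("INSERT INTO COUNTRY (CODE, NAME) VALUES ('DE', 'Germany')");
        } catch (Exception e) {
            throw new IllegalStateException("Seeding the database failed", e);
        }
    }
}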
I came across this question for the same reasons, trying to find an approach to run an initialization script after DDL generation. I offer this answer to an old question in hopes of shortening the amount of "literary research" for those looking for the same solution.
I'm using GlassFish 4 with its default EclipseLink 2.5 JPA implementation. The new Schema Generation feature under JPA 2.1 makes it fairly straightforward to specify an "initialization" script after DDL generation is completed.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1"
    xmlns="http://xmlns.jcp.org/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
    <persistence-unit name="cbesDatabase" transaction-type="JTA">
        <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
        <jta-data-source>java:app/jdbc/cbesPool</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="javax.persistence.schema-generation.create-source" value="metadata"/>
            <property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
            <property name="javax.persistence.sql-load-script-source" value="META-INF/sql/load_script.sql"/>
            <property name="eclipselink.logging.level" value="FINE"/>
        </properties>
    </persistence-unit>
</persistence>
The above configuration generates the DDL from metadata (i.e., annotations), after which the META-INF/sql/load_script.sql script is run to populate the database. In my case, I seed a few tables with test data and generate additional views.
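For illustration, such a load script is just plain SQL; keeping each statement on its own line and terminated with a semicolon is the safest format. The table, column, and view names below are made up and would need to match your own schema:

-- hypothetical seed data; adjust names to your schema
INSERT INTO ROLE (ID, NAME) VALUES (1, 'ADMIN');
INSERT INTO ROLE (ID, NAME) VALUES (2, 'USER');
CREATE VIEW ACTIVE_USERS AS SELECT U.* FROM APP_USER U WHERE U.ACTIVE = 1;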
Additional information on EclipseLink's use of JPA's properties can be found in the DDL Generation section of EclipseLink/Release/2.5/JPA21. Likewise, Section 37.5 Database Schema Creation in Oracle's Java EE 7 Tutorial and TOTD #187 also offer a quick introduction.
Have a look at Running a SQL Script on startup in EclipseLink, which describes a solution presented as a kind of equivalent to Hibernate's import.sql feature [1]. Credits to Shaun Smith:
Running a SQL Script on startup in EclipseLink
Sometimes, when working with DDL generation it's useful to run a script to clean up the database first. In Hibernate if you put a file called "import.sql" on your classpath its contents will be sent to the database. Personally I'm not a fan of magic filenames but this can be a useful feature.
There's no built in support for this in EclipseLink but it's easy to do thanks to EclipseLink's high extensibility. Here's a quick solution I came up with: I simply register an event listener for the session postLogin event and in the handler I read a file and send each SQL statement to the database--nice and clean. I went a little further and supported setting the name of the file as a persistence unit property. You can specify this all in code or in the persistence.xml.
The ImportSQL class is configured as a SessionCustomizer through a persistence unit property. On the postLogin event it reads the file identified by the "import.sql.file" property, which is also specified as a persistence unit property and passed to createEntityManagerFactory. This example also shows how you can define and use your own persistence unit properties.

import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.sessions.Session;
import org.eclipse.persistence.sessions.SessionEvent;
import org.eclipse.persistence.sessions.SessionEventAdapter;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ImportSQL implements SessionCustomizer {

    private void importSql(UnitOfWork unitOfWork, String fileName) {
        // Open file
        // Execute each line, e.g.,
        // unitOfWork.executeNonSelectingSQL("select 1 from dual");
    }

    @Override
    public void customize(Session session) throws Exception {
        session.getEventManager().addListener(new SessionEventAdapter() {
            @Override
            public void postLogin(SessionEvent event) {
                String fileName = (String) event.getSession().getProperty("import.sql.file");
                UnitOfWork unitOfWork = event.getSession().acquireUnitOfWork();
                importSql(unitOfWork, fileName);
                unitOfWork.commit();
            }
        });
    }
}
public static void main(String[] args) {
    Map<String, Object> properties = new HashMap<String, Object>();

    // Enable DDL Generation
    properties.put(PersistenceUnitProperties.DDL_GENERATION, PersistenceUnitProperties.DROP_AND_CREATE);
    properties.put(PersistenceUnitProperties.DDL_GENERATION_MODE, PersistenceUnitProperties.DDL_DATABASE_GENERATION);

    // Configure the SessionCustomizer which will pipe the SQL file to the database before DDL generation runs
    properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER, "model.ImportSQL");
    properties.put("import.sql.file", "/tmp/someddl.sql");

    EntityManagerFactory emf = Persistence.createEntityManagerFactory("employee", properties);
}
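The importSql body is left as a stub in the quoted code; a minimal sketch of one way to fill it in, assuming the script keeps one SQL statement per line and lines starting with "--" are comments:

// Needs: java.io.BufferedReader, java.io.FileReader, java.io.IOException
private void importSql(UnitOfWork unitOfWork, String fileName) {
    try (BufferedReader reader = new BufferedReader(new FileReader(fileName))) {
        String line;
        while ((line = reader.readLine()) != null) {
            line = line.trim();
            // Skip blank lines and "--" comments; execute everything else as a single statement
            if (line.isEmpty() || line.startsWith("--")) {
                continue;
            }
            unitOfWork.executeNonSelectingSQL(line);
        }
    } catch (IOException e) {
        throw new RuntimeException("Could not execute import file " + fileName, e);
    }
}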
I'm not sure it's a strict equivalent, though; in particular, I'm not sure the script will run after the database generation. Testing is required. If it doesn't, maybe it can be adapted.
[1] Hibernate has a neat little feature that is heavily under-documented and little known. You can execute an SQL script during SessionFactory creation, right after the database schema generation, to import data into a fresh database. You just need to add a file named import.sql to your classpath root and set either create or create-drop as your hibernate.hbm2ddl.auto property.
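For reference, on the Hibernate side this amounts to a single property in persistence.xml (plus the import.sql file at the classpath root); the value shown is one of the two that trigger the import:

<property name="hibernate.hbm2ddl.auto" value="create-drop"/>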
This might help, since there is some confusion here: for data seeding, use exactly the same set of properties as above (except the logger).
DO NOT USE:
<property name="eclipselink.ddl-generation" value="create-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
DO USE:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
I confirm this worked for me.
:) Just substitute your own data:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata-then-script"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/seed.sql"/>
It is called BEFORE DDL execution, and there seems to be no nice way to adapt it, as there is no suitable event one could use.
This approach executes SQL before the DDL statements, whereas what would be nice (for example, to insert seed data) is something that executes after the DDL statements. I don't know if I am missing something here. Can somebody please tell me how to execute SQL AFTER EclipseLink has created the tables (when the create-tables property is set to true)?