Can I import a mysql dump to a laravel migration?
  • Asked: 2014-01-23 11:56:36
  • Tags:
  • laravel
I have a complete database and need to create migrations for it. I guess there must be a way to do it from a dump, but I'm not sure. Is there a way to do this automatically, or at least an easier way to handle this task?
Best answer
If you can dump to a CSV: an alternative for some generic data tables (countries, states, postal codes) is to load them not via migrations but via seeders, although you could do the same thing in a migration file. In your seeder file:

public function run()
{
    $this->insertFromCsvFile('countries', 'path/to/countries.csv');
    $this->insertFromCsvFile('states', 'path/to/states.csv');
    $this->insertFromCsvFile('postal_codes', 'path/to/postal_codes.csv');
}

private function insertFromCsvFile($tableName, $filePath)
{
    if (!file_exists($filePath)) {
        echo 'File Not Found: ' . $filePath . "\n";
        return;
    }

    $headers = $rows = [];
    $file = fopen($filePath, 'r');

    while (($line = fgetcsv($file)) !== false) {
        // The first row should be header values that match column names.
        if (empty($headers)) {
            $headers = explode(',', implode(',', $line));
            continue;
        }

        $row = array_combine($headers, $line);

        // Turn literal "NULL" strings from the CSV into real nulls.
        foreach ($row as &$val) {
            if ($val === 'NULL') {
                $val = null;
            }
        }

        $rows[] = $row;

        // Insert in chunks of 500 rows; adjust based on memory constraints.
        if (count($rows) === 500) {
            DB::table($tableName)->insert($rows);
            $rows = [];
        }
    }

    fclose($file);

    if (count($rows)) {
        DB::table($tableName)->insert($rows);
    }
}

Run the seeder:

php artisan db:seed --class=GenericTableSeeder
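If you also want the CSV data loaded by a plain php artisan db:seed, you can register the seeder in DatabaseSeeder. A small sketch, assuming the class holding the methods above is named GenericTableSeeder (as the --class option suggests):

use Illuminate\Database\Seeder;

class DatabaseSeeder extends Seeder
{
    public function run()
    {
        // Delegates to the CSV-importing seeder defined above.
        $this->call(GenericTableSeeder::class);
    }
}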
Other answers
You can import dumps in Laravel like this:

DB::unprepared(file_get_contents('full/path/to/dump.sql'));

If I were to refactor an existing app, though, I'd take the time to write migrations from scratch, import the dump into different tables (or a different db, if table names are the same), then import the content into the new structure via seeds.
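A minimal sketch of how this could sit inside a migration's up() method (the class name ImportLegacyDump and the dump path are placeholders, not something from this answer):

use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

class ImportLegacyDump extends Migration
{
    public function up()
    {
        // Feed the raw dump to the database in one go.
        DB::unprepared(file_get_contents('full/path/to/dump.sql'));
    }

    public function down()
    {
        // Reversing a raw dump is up to you; dropping the imported tables
        // here is one option.
    }
}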
Laravel can't do that out of the box, but I think this will help: Laravel migration generator. It generates migrations based on existing tables.
This question is answered already, but recently in a project the provided answers no longer satisfied my needs. This also does not import a whole database dump, but one (large) table; I felt I should share it with you anyway. The problem was that I wanted to import a quite large table (a list of zipcodes) during my artisan migrate operation. The solution with DB::unprepared($dump) took way too long, and I found an alternative which is MUCH faster: just export your table as CSV and use the following code in your migration's up() function.

// I had to str_replace the backslashes on a Windows dev system... but it works on Linux, too.
$filename = str_replace("\\", "/", storage_path('path/in/storage/to/your/file.csv'));
$query = "LOAD DATA LOCAL INFILE '".$filename."' INTO TABLE yourtable ".
    "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n' ".
    "IGNORE 0 LINES (col1,col2,...);";
DB::unprepared($query);

Just update the query as you need. And of course, you should make sure that the table with the columns col1, col2, etc. exists. I created it just before importing the file, with Schema::create().

If you run into the following error message:

PDO::exec(): LOAD DATA LOCAL INFILE forbidden

there is a way to get rid of it. Although it's not really documented, you can just add an options key to your config/database.php file. For example, mine looks like this:

'mysql' => [
    'driver'    => 'mysql',
    'host'      => env('DB_HOST', 'localhost'),
    'database'  => env('DB_DATABASE', 'forge'),
    'username'  => env('DB_USERNAME', 'forge'),
    'password'  => env('DB_PASSWORD', ''),
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => '',
    'strict'    => false,
    'options'   => array(
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ),
],

Note: I'm currently using Laravel 5, but it should work with Laravel 4, too.
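The Schema::create() call mentioned above is not shown in the answer, so here is a rough, hypothetical sketch of what it could look like (the table and column names are invented for illustration and must match the column list in your LOAD DATA statement):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('yourtable', function (Blueprint $table) {
    // Hypothetical columns; replace with whatever col1, col2, ... actually are.
    $table->increments('id');
    $table->string('zipcode', 10);
    $table->string('city');
});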
"I have a complete database and need to create migration. I guess there must be a way to do it from a dump but not sure. Is there any way automatically or at least easier to do this task?" Not automatically, but we run dumps in a migration using DB::unprepared(). You could use file_get_contents to import from a .sql file and thus not have to worry about escaping the entire dump's quotation marks...
Another alternative is using the PDO directly:

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\File;

$sql_dump = File::get('/path/to/file.sql');
DB::connection()->getPdo()->exec($sql_dump);
I am writing my answer as this might help someone who is using the new Laravel 8. In Laravel 8, dumping the schema to SQL and running migrations from that SQL dump file is possible. Please refer to the link below for more detail: https://laravel.com/docs/8.x/migrations#squashing-migrations

php artisan schema:dump

// Dump the current database schema and prune all existing migrations...
php artisan schema:dump --prune

schema:dump will create a new directory under database > schema, and the SQL dump will be stored there. After that, when you migrate, Laravel will first run the dump file from the schema directory and then any pending migrations.
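As a rough usage sketch (my assumption, not part of the docs excerpt above): after squashing, you can verify the result locally by rebuilding the database, which loads the schema dump first and then runs any remaining migrations:

php artisan schema:dump --prune
php artisan migrate:fresh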
Another solution that worked for me in Laravel 5.2:

DB::unprepared(File::get('full/path/to/dump.sql'));
A simple solution provided by Laravel Article for generating a migration file from an existing database table. Try: https://laravelarticle.com/laravel-migration-generator-online
You can create Laravel migrations and models directly from the database using https://github.com/XCMer/larry-four-generator. Execute the following command after installing the package:

php artisan larry:fromdb
I recently faced the same problem. I didn't want to install a package specially for that, so I decided to write a little tool to help me and others ;) Here is the link: http://laravel.stonelab.ch/sql-seeder-converter/ And here you can comment on it if you have any improvement proposals or questions: http://www.stonelab.ch/en/sql-to-laravel-seeder-converter/
You can use the Raahul/Larryfour package, a model and migration generator for Laravel 4. After installation you can use the command line to create migrations from an existing database like this:

php artisan raahul:fromdb --only yourdatabase

You will find the migrations in the app/migrations/ folder.
I think the best way to solve this is to use the mysql client itself to import the file, to avoid running out of memory.

use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Process;
use Illuminate\Support\Facades\Storage;

$host     = env('DB_HOST');
$user     = env('DB_USERNAME');
$password = env('DB_PASSWORD');
$database = env('DB_DATABASE');
$fpath    = Storage::path('database_dumps/dump.sql');

$command = "mysql -h$host -u$user -p$password $database < $fpath";
info("running: $command");

$process = Process::run($command);

if ($process->failed()) {
    Log::error($process->errorOutput());
} else {
    Log::info($process->output());
}
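Note that the credentials are interpolated straight into a shell string; if the password contains spaces or shell metacharacters, escaping each piece with escapeshellarg() is safer, for example:

$command = sprintf(
    'mysql -h%s -u%s -p%s %s < %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($password),
    escapeshellarg($database),
    escapeshellarg($fpath)
);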



