[mySQL] transferring a huge database to a web site
Moderator: MOD_nyhetsgrupper
How would you transfer a huge database to a web site?
Until now, I have worked with small databases. But now I would
like to put some genealogical databases on the web site of my
genealogical society (limited to members only).
I am working with phpMyAdmin installed on both my PC and the
web site, with different setups (the PC runs Windows 98 and the
web site runs Linux). The database would be something like
35 MB in SQL statements (or 12 to 15 MB zipped).
Is there any way to transfer the database other than through
the phpMyAdmin interface (which is limited to about 2 MB
files)? I tried copying the MySQL files directly to my
other portable computer (it also runs Linux), but it doesn't see
the new dataset, as if some description of the dataset
were missing.
Denis
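One way around a ~2 MB upload limit (a sketch, not a tested recipe for this
particular host: the file names are placeholders, and it assumes the dump can
be reassembled server-side, e.g. via shell access) is to gzip the dump and
split it into chunks below the limit:

```shell
# Sketch: compress a large SQL dump and split it into chunks small
# enough for a ~2 MB upload limit. All file names are hypothetical;
# random bytes stand in for the real 35 MB dump.
set -e
mkdir -p /tmp/dumpdemo && cd /tmp/dumpdemo
head -c 5000000 /dev/urandom > mydump.sql   # stand-in for the real dump

gzip -c mydump.sql > mydump.sql.gz          # a real SQL dump shrinks a lot
split -b 1500000 mydump.sql.gz part_        # 1.5 MB chunks: part_aa, part_ab, ...

# On the server, reassemble in name order and decompress:
cat part_* > rejoined.sql.gz
gzip -dc rejoined.sql.gz > rejoined.sql

cmp mydump.sql rejoined.sql && echo "round-trip OK"
```

`cat part_*` relies on `split`'s alphabetical suffixes, so the chunks rejoin in
the right order without any bookkeeping.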
Re: [mySQL] transferring a huge database to a web site
Try using the mysqldump & mysqlimport utilities; here is a page that
details copying a database to another machine, locally or remotely:
http://dev.mysql.com/doc/mysql/en/upgra ... -arch.html
Here is a snippet:
You can also use mysqldump and mysqlimport to transfer the database. For
big tables, this is much faster than simply using mysqldump. In the
following commands, DUMPDIR represents the full pathname of the
directory you use to store the output from mysqldump.
First, create the directory for the output files and dump the database:
shell> mkdir DUMPDIR
shell> mysqldump --tab=DUMPDIR db_name
Then transfer the files in the DUMPDIR directory to some corresponding
directory on the target machine and load the files into MySQL there:
shell> mysqladmin create db_name # create database
shell> cat DUMPDIR/*.sql | mysql db_name # create tables in database
shell> mysqlimport db_name DUMPDIR/*.txt # load data into tables
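For the "transfer the files" step in the middle, one option (a sketch; the
directory name is a placeholder and the `scp` target host is hypothetical) is
to bundle DUMPDIR into a single compressed archive:

```shell
# Sketch: bundle the mysqldump --tab output for transfer. The directory
# and file names are placeholders standing in for real dump output.
set -e
DUMPDIR=/tmp/dumpdir_demo
mkdir -p "$DUMPDIR"
# Stand-ins for the .sql (schema) and .txt (data) files --tab writes:
echo "CREATE TABLE t (id INT);" > "$DUMPDIR/t.sql"
printf '1\n2\n3\n' > "$DUMPDIR/t.txt"

tar -C /tmp -czf /tmp/dump.tar.gz dumpdir_demo

# Then, e.g.:  scp /tmp/dump.tar.gz user@target:/tmp/   (host is hypothetical)
# and on the target:  tar -C /tmp -xzf /tmp/dump.tar.gz

mkdir -p /tmp/restore
tar -C /tmp/restore -xzf /tmp/dump.tar.gz
cmp "$DUMPDIR/t.txt" /tmp/restore/dumpdir_demo/t.txt && echo "archive OK"
```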
Regards
John Ellingsworth
http://ellingsworth.org
Re: [mySQL] transferring a huge database to a web site
"Denis Beauregard" <[email protected]> wrote in message
news:[email protected]...
You can use something like SQLyog. I use it to transfer my databases from my
laptop running Windows to my server running Solaris. I haven't had any
problems yet, and one of my databases is 40 MB. And SQLyog is free for
personal use.
Re: [mySQL] transferring a huge database to a web site
On Wed, 23 Mar 2005 09:36:48 -0500, Denis Beauregard
<[email protected]> wrote in soc.genealogy.computing:
I got a suggestion from another newsgroup:
use mybase;
load data infile 'mytable.txt' into table mytable;
Tested with a 35 MB data file. Also tested with a path, i.e.
load data infile 'c:\\data\\bigbase\\mybase.txt' into table mytable;
Denis
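For a remote server where the data file sits on the client, the LOCAL variant
is the usual form. This sketch just writes the statements to a file (table,
database and file names are the ones from the post; the commented-out mysql
call assumes a live server with local_infile enabled):

```shell
# Sketch of the LOCAL variant of the statement above. Names come from
# the post; the mysql invocation itself needs a live server.
cat > /tmp/load.sql <<'EOF'
USE mybase;
-- LOCAL makes the client read 'mytable.txt' and send it to the server,
-- so the file does not have to live on the server machine:
LOAD DATA LOCAL INFILE 'mytable.txt' INTO TABLE mytable;
EOF

# Illustrative only:
# mysql --local-infile=1 -u user -p < /tmp/load.sql
grep -c 'LOCAL INFILE' /tmp/load.sql
```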
Re: [mySQL] transferring a huge database to a web site
Denis Beauregard wrote:
Copying the MySQL database files (the .frm, .myi and .myd files, as well
as their directory) *will* work, but for safety you should first
shut down the MySQL server on the source machine so that in-memory
changes are flushed to the files.
Then you should shut down the MySQL server on the destination machine.
Now copy the files to the server's data directory, and then restart the
server. You should find that it sees the new dataset(s).
Hope this helps.
David Harper
Cambridge, England
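The file-copy route David describes looks roughly like this (a sketch with
placeholder paths: a real copy targets the server's data directory, e.g.
/var/lib/mysql/<db>, needs the servers stopped, and needs root to set the
mysql:mysql ownership):

```shell
# Sketch of the file-copy route. Paths and file names are hypothetical
# stand-ins; the init.d commands are commented out since they need root
# and a real MySQL installation.
set -e
SRC=/tmp/mysql_src/mybase
DST=/tmp/mysql_dst/mybase
mkdir -p "$SRC" "$DST"
# Stand-ins for the three files that make up one MyISAM table:
touch "$SRC/mytable.frm" "$SRC/mytable.MYD" "$SRC/mytable.MYI"

# /etc/init.d/mysql stop          # on both machines, in the real case
cp -p "$SRC"/mytable.* "$DST"/
chmod 660 "$DST"/mytable.*        # owner/group should end up mysql:mysql
# /etc/init.d/mysql start

ls "$DST"
```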
Re: [mySQL] transferring a huge database to a web site
In message <4Wt0e.97$%[email protected]>, David Harper
<[email protected]> writes
It shouldn't be necessary to shut down the destination MySQL server. Make
sure you're not overwriting any files - i.e. it's a new table. Make sure
that the owner, group and modes are correct, e.g. mysql:mysql and 660, and
then you'll be able to query the new table. Use 'rename table' to
replace one table with another - it should be atomic. A 'flush tables' first
would be a good idea.
I must admit I tend to use 'load data' - biggest file 485 MB.
Chad
--
Chad Hanna
Chairman Berkshire Family History Society http://www.berksfhs.org.uk
Quality Family History Data http://www.familyhistoryonline.net
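The 'flush tables' plus 'rename table' swap Chad describes can be put in a
small SQL file and piped into mysql. The table names below are hypothetical,
and the mysql invocation is commented out since it needs a live server:

```shell
# Sketch of the atomic table-swap described above. Table and database
# names are placeholders.
cat > /tmp/swap.sql <<'EOF'
-- Close and flush any cached table handles first:
FLUSH TABLES;
-- Atomically swap the freshly copied table into place:
RENAME TABLE mytable TO mytable_old, mytable_new TO mytable;
EOF

# Against a real server:  mysql -u user -p db_name < /tmp/swap.sql
grep -c 'RENAME TABLE' /tmp/swap.sql
```

Because both renames happen in one statement, queries never see a moment where
`mytable` is missing, which is the point of doing it this way.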
Re: [mySQL] transferring a huge database to a web site
Denis Beauregard wrote:
If you've got shell access, you can export a MySQL database with the
command
mysqldump --user=<username> --password --add-drop-table \
<database> > <filename>.sql
To import the same database, use:
mysql -u <username> -p <database> < <filename>.sql >/dev/null
--
Leif Biberg Kristensen
http://solumslekt.org/
Re: [mySQL] transferring a huge database to a web site
"Chad Hanna" <[email protected]> wrote in message
news:[email protected]...
FreeBMD has some rather larger tables in MySQL, and we replicate the
database from machine to machine using rsync (with flush-tables as required).
Re: [mySQL] transferring a huge database to a web site
In message <[email protected]>, Dave Mayall
<[email protected]> writes
Hi Dave,
OK, yours (meaning FreeBMD, not FreeCEN of course) is bigger than mine,
though I was referring to a LOAD file - the Yorkshire 1881 census, I
believe - not a table.
I must admit I tend to use merge tables (.MRG) fairly extensively to
keep each of 20 or so individual compressed tables down to about a
gigabyte in size. Even after tweaking MySQL's 'huge' configuration there
seemed to be a big drop-off in the speed of big ALTER statements above a
gigabyte or so. At least there was in MySQL 3 - version 4 may be more tweakable.
Of course, I spent too long fighting a problem with a duff chipset on
two of our servers (no names, no pack drill). This meant intensive disk
operations like ALTERs often gave errors, and smaller files were one way
of coping.
I've used replication to keep two servers in sync, esp. the financial
stuff.
Of course my main wish is for more disk spindles!
I know we're talking about genealogy data, but I think we may be boring
our listeners.
Cheers, Chad
--
Chad Hanna
Chairman Berkshire Family History Society http://www.berksfhs.org.uk
Quality Family History Data http://www.familyhistoryonline.net
Re: [mySQL] transferring a huge database to a web site
On Thu, 24 Mar 2005 10:41:13 +0000, Chad Hanna
<[email protected]> wrote in soc.genealogy.computing:
load data is working on my local computer (Win 98 + EasyPHP),
but it produces an error 1045 on the remote server (Linux Fedora).
I read the MySQL documentation on the page
http://dev.mysql.com/doc/mysql/en/grant.html but I don't see
what I am doing wrong. Moreover, it says that GRANT FILE will enable
SELECT, and I can already do SELECT in phpMyAdmin, so I must already have
that privilege. When I try the GRANT command, it complains with the
same 1045 error message:
SQL request:
GRANT FILE ON *.* TO xxxx@localhost
MySQL answered:
#1045 - Access denied for user: 'yyyy@zzzz' (Using password: YES)
Denis
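One likely reading of that 1045 (an interpretation, not a diagnosis of this
particular host): FILE is a global privilege, so only an account that itself
holds GRANT OPTION, typically root, can grant it, and shared-hosting accounts
usually can't grant themselves anything. The usernames below are the
placeholders from the post:

```shell
# Sketch of what the server admin (not the hosting account) would run.
# User names are the placeholders from the post; the mysql call is
# commented out since it needs admin access to a live server.
cat > /tmp/grant.sql <<'EOF'
-- FILE is a global privilege and must be granted by an account that
-- itself has GRANT OPTION (typically root). Trying to grant it from
-- an ordinary hosting account produces exactly an access-denied error.
GRANT FILE ON *.* TO 'xxxx'@'localhost';
FLUSH PRIVILEGES;
EOF

# mysql -u root -p < /tmp/grant.sql
grep -c 'GRANT FILE' /tmp/grant.sql
```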
Re: [mySQL] transferring a huge database to a web site
In message <[email protected]>, Denis
Beauregard <[email protected]> writes
You probably know that 'load data local infile' can be disabled in the
server, but I'd have thought you would get a different error.
Use "show variables like 'local_infile'" to find out.
If you can't issue a grant, it makes me think that you haven't been
granted the privilege (with grant) to do so.
I'm on MySQL 4.0.16, but I don't know what's included in Fedora.
Cheers
--
Chad Hanna
Chairman Berkshire Family History Society http://www.berksfhs.org.uk
Quality Family History Data http://www.familyhistoryonline.net
Re: [mySQL] transferring a huge database to a web site
On Thu, 7 Apr 2005 18:02:42 +0100, Chad Hanna
<[email protected]> wrote in soc.genealogy.computing:
I didn't know. I am a beginner in SQL...
I put the statement in the phpMyAdmin box, but it showed no result.
Then I tried "show variables like '%'" and it displayed a long
list of variables. So local_infile is not defined.
I think the version is
Client API version 3.23.58 (I don't see anything else that may look
like a release number in the phpinfo() output).
PHP is PHP/4.3.4.
So the function is not available at all? What else can I do?
My dataset is 55 MB, and I don't want to run a set of
SQL inserts 30 times...
I will try the command-line dump suggested in another message.
I found the .exe files in EasyPHP, and the same commands are available
in the shell too. Hopefully this will do the job...
Denis
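The command-line route could look like the two-step script below (a sketch
only: every name is a placeholder, both steps need real credentials and a
reachable server, so here the script is only syntax-checked):

```shell
# Sketch of the dump-compress-reload plan. All names (users, db, host)
# are placeholders; the script is written to a file and syntax-checked,
# since actually running it needs live MySQL servers.
cat > /tmp/transfer.sh <<'EOF'
#!/bin/sh
# 1. Dump and compress locally (35 MB of SQL -> roughly 12-15 MB gzipped):
mysqldump -u localuser -p mybase | gzip > mybase.sql.gz
# 2. Copy mybase.sql.gz to the server (scp, FTP, ...), then on the server:
#    gzip -dc mybase.sql.gz | mysql -u remoteuser -p mybase
EOF

sh -n /tmp/transfer.sh && echo "syntax OK"
```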