[Solved] Question about DB Backups

iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I just ran a manual backup of my site using the Backup module: I kicked off a backup of the database and of the modules/skins.

When I log into S3, I only see files for the modules and skins. Is there supposed to be a listing for the database also?

Thank you in advance!
Brian
updated by @ilovehousemusic: 04/03/14 01:26:03AM
brian
@brian
10 years ago
10,148 posts
Yeah, you should see a dump of each table as an SQL file. I just checked here and it all seems to be working...
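
Roughly speaking, the per-table dump comes down to something like this sketch - not the module's exact code, and the DB login, S3 keys, bucket name and paths are just placeholders:

<?php
// Rough sketch only: dump each table to its own .sql file and push it to S3
// with the bundled contrib S3 class. The DB login, keys and bucket are placeholders.
require_once 'modules/jrBackup-release-1.0.3/contrib/S3/S3.php';

S3::setAuth('ACCESS_KEY', 'SECRET_KEY');                          // placeholder keys
$bucket = 'my-backup-bucket';                                     // placeholder bucket

$db  = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');  // placeholder DB login
$res = $db->query('SHOW TABLES');
while ($row = $res->fetch_row()) {
    $table = $row[0];
    $file  = "/tmp/{$table}.sql";
    // one table per file, so a single failed upload doesn't lose the whole dump
    exec("mysqldump -u'db_user' -p'db_pass' db_name " . escapeshellarg($table) . ' > ' . escapeshellarg($file));
    S3::putObject(S3::inputFile($file), $bucket, "database/{$table}.sql", S3::ACL_PRIVATE);
    unlink($file);
}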


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I don't see the SQL file on my side. Wonder what's up with that... I'll check around.

Actually, I think you already answered my question here:
https://www.jamroom.net/the-jamroom-network/documentation/modules/1510/jrbackup

So in my scenario, my site is around 90 GB. The first backup will take a loooong time, and then each one after that only transfers the changed files.
brian
@brian
10 years ago
10,148 posts
iLoveHouseMusic:
I don't see the SQL file on my side. Wonder what's up with that... I'll check around.

Actually, I think you already answered my question here:
https://www.jamroom.net/the-jamroom-network/documentation/modules/1510/jrbackup

So in my scenario, my site is around 90 GB. The first backup will take a loooong time, and then each one after that only transfers the changed files.

Oh yeah - give it time - it will take a while to transfer 90 gigs over to S3.
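
On the "only the changed files" part, one common way to skip files that haven't changed (this is just an illustration of typical sync logic, not necessarily the exact check jrBackup uses) is to compare the local file against the copy already in the bucket:

<?php
// Sketch: skip an upload when S3 already has an identical copy of the file.
// Assumes S3::setAuth() has already been called; $bucket and $uri are placeholders.
require_once 'modules/jrBackup-release-1.0.3/contrib/S3/S3.php';

function needs_upload($local_file, $bucket, $uri)
{
    $info = S3::getObjectInfo($bucket, $uri);   // false if the key isn't in S3 yet
    if ($info === false) {
        return true;
    }
    // re-upload only when the size or MD5 hash differs from the stored object
    return ($info['size'] != filesize($local_file) || $info['hash'] !== md5_file($local_file));
}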


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I just want to keep this thread alive. I get this error in my log:
[05-Feb-2014 22:36:31 UTC] PHP Warning: S3::putObject(): [InternalError] We encountered an internal error. Please try again. in /home/xxxxxx/public_html/modules/jrBackup-release-1.0.3/contrib/S3/S3.php on line 326 [x 1]

In S3, I only see skinsxxx.zip, modulesxxx.zip files and the /media folder.

I'm going to poke around a bit more, but I wanted to get your thoughts on that.
updated by @ilovehousemusic: 02/07/14 04:06:57PM
brian
@brian
10 years ago
10,148 posts
That error means S3 responded with a problem, so the file could not be "put".

Note that we've been running the S3 backup here on jamroom.net for months, and I run it on a couple of my other sites as well, and I do see this type of error every once in a while (maybe once every couple of weeks).
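
When it's just the occasional transient InternalError like that, wrapping the put in a simple retry is usually enough. A rough sketch - the file/bucket/key names are placeholders, and this isn't the module's current code:

<?php
// Sketch: retry a failed put a few times, since S3 "InternalError" responses
// are normally transient. Assumes S3::setAuth() is already configured;
// $file, $bucket and $uri are placeholders.
require_once 'modules/jrBackup-release-1.0.3/contrib/S3/S3.php';

function put_with_retry($file, $bucket, $uri, $tries = 3)
{
    for ($i = 1; $i <= $tries; $i++) {
        // putObject() returns false (and raises a PHP warning) when the request fails
        if (@S3::putObject(S3::inputFile($file), $bucket, $uri, S3::ACL_PRIVATE)) {
            return true;
        }
        sleep($i * 10);   // back off a little longer before each retry
    }
    return false;
}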

Has your media backup completed successfully? For the size of your site it may take a week or so before the main media backup is fully completed, and we may want to move the database dump in front of the media backup in a future release.

Hope this helps!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
brian:
Has your media backup completed successfully? For the size of your site it may take a week or so before the main media backup is fully completed, and we may want to move the database dump in front of the media backup in a future release.
Hope this helps!

I will verify that. Won't there be issues then if I set it to daily backup but the first full pass hasn't completed?
The only way to accomplish that, then, would be to disable the daily backup and run a manual full backup, which requires me to be logged in with the browser open. That has timed out on me each time I've tried.

So I'm relying on the automated daily backup to do its job. Perhaps this is the issue: the first pass is taking too long?
brian
@brian
10 years ago
10,148 posts
One of the problems of going from "0 to 100 gigs" overnight is that it could easily take more than a day to back up, so I'm not sure there is an easy solution to make it all work the very first time. Each day that it runs it should get further along in the process, however - can you verify where it is in your media?

I've also confirmed that it currently backs up the DB tables first, then moves on to profiles.


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
Thanks, let me find out how far it's getting...
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I can see my media folders numbered 1-15, with the newest one being from 2/15. I'll come back to the idea that my site's media hasn't fully backed up yet.

My concern at the moment is that I'm not seeing the DB backed up. I kick off a manual backup of the database only, and I don't see it go into S3.

I reloaded the module with the same result.

I don't see anything out of the ordinary in the logs, but next to the "manual backup completed" PHP log entry, I click the "!" button to the right and get this output:

Message [xxxxxxx]: manual system backup started
Date 02/17/14 13:58:25
IP Address xx.xx.xx.xx
URL /
Memory 3MB
Data
ffmpeg version N-50408-gdc666d3 Copyright (c) 2000-2013 the FFmpeg developers
built on Feb 28 2013 23:01:11 with gcc 4.3.2 (Debian 4.3.2-1.1)
configuration: --enable-shared --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --cc=cc --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame
libavutil 52. 17.103 / 52. 17.103
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.102 / 54. 63.102
libavdevice 54. 3.103 / 54. 3.103
libavfilter 3. 41.100 / 3. 41.100
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
[mp3 @ 0x106435e0] Format mp3 detected only with low score of 1, misdetection possible!
[mp3 @ 0x10643f20] Header missing
[mp3 @ 0x106435e0] decoding for stream 0 failed
[mp3 @ 0x106435e0] Could not find codec parameters for stream 0 (Audio: mp3, 0 channels, s16p): unspecified frame size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[mp3 @ 0x106435e0] Estimating duration from bitrate, this may be inaccurate
/home/xxxx/public_html/data/media/1/152/jrAudio_606_audio_file.mp3: could not find codec parameters

I don't know how that's related... Can I validate anything else to try to get this working?
brian
@brian
10 years ago
10,148 posts
The daily backup runs as a "shutdown process" from the event queue, which can also process other items (such as a conversion). I don't think this is actually related - it just happened to cause an error during the queue process that was kicked off by daily maintenance.

If you're at media dir 15, then that is 15,000+ profiles - it could likely still be trying to complete.


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
OK, back to this one. I'm testing this out again in the hope that the entire site has backed up by now. I'm going to do a manual backup of only the DB and see if it pops into my bucket.
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
Hmmm no dice. Is this something you think you could look at also?
brian
@brian
10 years ago
10,148 posts
iLoveHouseMusic:
Hmmm no dice. Is this something you think you could look at also?

I'm fairly certain that due to the size of your system, it is just not able to fully complete a backup before it starts over. In order for this module to work on your system it's probably going to have to be re-engineered to maintain checkpoints. That's something I can look into when I get a chance.
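
Conceptually, the checkpointing would just mean recording the last profile that finished and picking up from there on the next run. A rough sketch - the helper functions and the checkpoint file are hypothetical placeholders, nothing that exists in the current module:

<?php
// Conceptual checkpoint/resume sketch - get_profile_ids_after() and
// backup_profile_media() are hypothetical placeholders, and the checkpoint
// file location is just an example.
$checkpoint = 'data/cache/jrBackup_last_profile_id';
$last_done  = is_file($checkpoint) ? (int) file_get_contents($checkpoint) : 0;

foreach (get_profile_ids_after($last_done) as $profile_id) {
    backup_profile_media($profile_id);
    // record progress after every profile so an interrupted run resumes here
    file_put_contents($checkpoint, $profile_id);
}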

Thanks!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I follow you on that part. I was referring to the inability to get a successful database-only manual backup. Or is that related to the backup not completing its first pass? They should be separate, no?
brian
@brian
10 years ago
10,148 posts
Yeah, that should complete no problem. I've still got your server login here - is it OK for me to check it out?

Thanks!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
Yes, go for it, thank you!
brian
@brian
10 years ago
10,148 posts
OK, I found the root cause of the DB not being backed up and have fixed it in version 1.1.0, which is in the Marketplace. I updated your site and have run a DB and module/skins backup no problem.

I was also manually running a profile backup, and it was around profile_id 14,000 when my system locked up here and I had to reboot, but it looked like it was working no problem as well. Try running the profile backup manually again and it should complete (it will take some time, but it should complete).

Hope this helps!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
@brian - thanks for your help on this. Let me check AWS console and I'll report back.
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
I see all the tables now in my S3 bucket!! WOOHOO!!

One last question: does the API have the ability to pull the total bucket size into the Jamroom module? I think if that could be displayed, the JR user could see whether their data is being backed up AND get an idea of what it's costing them.

I searched the Amazon docs but could not find info about viewing your bucket properties. It seems like a simple feature to add, but I can't find how to determine the existing bucket size.

TIA
brian
@brian
10 years ago
10,148 posts
There might be, I can check it out.

Thanks!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
It looks like it has to be calculated/tallied. Like, there's no single piece of data you can pull down. http://stackoverflow.com/questions/8975959/aws-s3-how-do-i-see-how-much-disk-space-is-using
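
Something along these lines, going by that thread - just a rough sketch using the bundled S3 class, where the bucket name is a placeholder (and listing a big bucket this way is slow):

<?php
// Sketch: tally bucket size by listing every object and summing the sizes -
// there's no single "bucket size" value to fetch. Assumes S3::setAuth() has
// already been called; the bucket name is a placeholder.
require_once 'modules/jrBackup-release-1.0.3/contrib/S3/S3.php';

$bucket = 'my-backup-bucket';   // placeholder
$total  = 0;
foreach (S3::getBucket($bucket) as $object) {
    $total += $object['size'];
}
echo round($total / 1073741824, 2) . " GB\n";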
brian
@brian
10 years ago
10,148 posts
iLoveHouseMusic:
It looks like it has to be calculated/tallied. Like, there's no single piece of data you can pull down. http://stackoverflow.com/questions/8975959/aws-s3-how-do-i-see-how-much-disk-space-is-using

Yeah then that's not something I would add to the module - you'll just need to check out your S3 account.

Thanks!


--
Brian Johnson
Founder and Lead Developer - Jamroom
https://www.jamroom.net
iLoveHouseMusic
@ilovehousemusic
10 years ago
517 posts
No problem, thanks for looking into it and thanks for your help w/ the backup issues!
