
Helix Core Backup Script

perforce backup

3 replies to this topic

#1 ian.morris@trad.fi

    Newbie

  • Members
  • 5 posts

Posted 29 January 2019 - 02:57 PM

Hi

I'm trying to write a simple backup script to do the following with Helix Core

1. Take a checkpoint and rotate the journal (p4d -jc)
2. Stop Perforce
3. Copy the files in the Perforce root directory to another location
4. Start Perforce

I thought this would be easy, but being new to Perforce I'm struggling a little with it. I end up with .db files in the wrong place when I start Perforce, permission issues, connections refused, and so on; too many issues to go into detail about. If anyone already has a script that does something similar, it might save me some time. I've sketched the rough shape I'm after below.
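
This is roughly what I'm aiming for (the root path is a placeholder, and I'm assuming the Linux package install managed by p4dctl):

p4d -r /opt/perforce/root -jc                             # 1. checkpoint + rotate the journal
p4dctl stop -a                                            # 2. stop Perforce
cp -a /opt/perforce/root /backups/perforce-$(date +%F)    # 3. copy the root somewhere else
p4dctl start -a                                           # 4. start Perforce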

Thanks

#2 Sambwise

    Advanced Member

  • Members
  • 784 posts

Posted 29 January 2019 - 10:50 PM

As far as I know, the Perforce SDP is the standard if you want a pre-made script: https://swarm.worksh...e-software-sdp/

If you just want to fix up your own script, though, it sounds like all that's happening is that you aren't setting the server environment variables. This works a bit differently on Windows vs. Unix: on Windows you set them on the service, e.g. "p4 set -S Perforce P4ROOT=C:\your\root", while on Unix you can do it via environment variables, but I think more typically you just embed the settings in your startup script, e.g. "p4d -r /usr/perforce/root -p 1666". If you're passing those flags, make sure to pass them to every invocation of p4d (including "p4d -jc" and similar commands).
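
On Unix that usually ends up looking something like this (the root and port are just the example values from above); the key point is that the same -r goes on every p4d call:

p4d -r /usr/perforce/root -p 1666 -d     # startup script: daemonize against an explicit root and port
p4d -r /usr/perforce/root -jc            # backup script: checkpoint against that same root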

If you start Perforce with the wrong root location, it'll just create a new, empty set of db files and essentially start up a whole new server instance. That doesn't hurt anything permanently, but obviously you can't do anything with your "real" server instance while the daemon is serving up a different one. "p4 info" is a useful debugging tool, since it tells you which server root p4d is currently using (among a bunch of other pertinent data).
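
A quick sanity check, using the example port from above:

p4 -p 1666 info | grep "Server root"     # should print the root you actually started p4d with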

#3 ian.morris@trad.fi

    Newbie

  • Members
  • 5 posts

Posted 30 January 2019 - 09:17 AM

The Perforce SDP looks like a great resource and I hadn't seen it before. I installed it on a clean Linux server, but when I ran any p4 commands the shell couldn't find them. I didn't want to spend time getting that working now, but I'll revisit it another time.

For now I have a basic backup script, which I will put in the cron.daily folder:

#!/bin/bash
# Verify archive integrity before backing anything up
p4 verify -q //...
# Take a checkpoint and rotate the journal
p4 admin checkpoint
# Stop all p4dctl-managed services so the db files aren't being written to
p4dctl stop -a
# Copy the server root to S3 into a dated folder
day=$(date +"%d-%m-%Y")
aws s3 cp /opt/perforce/servers/master/ s3://mys3bucket/perforce_backup/$day/ --recursive
# Restart the services
p4dctl start -a

#4 Matt Janulewicz

    Advanced Member

  • Members
  • 155 posts
  • Location: San Francisco, CA

Posted 08 March 2019 - 12:20 AM

I second the SDP idea. It's great. Everyone should use it, no matter how small your server.

We can't afford the downtime for running a checkpoint (it's an hours-long process for us), so we do everything in a second, offline copy of the database.

Once a week we 'swap in' the clean offline database, giving us about 10 seconds of downtime per server per week, no matter how big our db files get.
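
The rough mechanics, stripped of all the SDP scripting (the /p4/1 layout and journal numbers are just placeholders for our setup):

p4d -r /p4/1/root -jj                                        # rotate the live journal only; no long db lock
p4d -r /p4/1/offline_db -jr /p4/1/checkpoints/journal.123    # replay the rotated journal into the offline copy
p4d -r /p4/1/offline_db -jd /p4/1/checkpoints/p4_1.ckp.124   # dump the checkpoint from the offline copy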

One thing that's probably messing you up right off the bat is that the SDP doesn't touch what I'd call the 'standard' environment, at least not on Linux. It doesn't copy binaries to /usr/bin or anything like that, so you have to put the SDP bin path in your own $PATH. I just stick this in all my server users' ~/.bash_profile:

export PATH=/depotdata/p4/common/bin:$PATH
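
After re-sourcing the profile, a quick check that the binaries resolve (same path as above):

source ~/.bash_profile
which p4 p4d     # should now point into /depotdata/p4/common/bin
p4 -V            # prints the version banner if the PATH change took
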
-Matt Janulewicz
Staff SCM Engineer, Perforce Administrator
Dolby Laboratories, Inc.
1275 Market St.
San Francisco, CA 94103, USA
majanu@dolby.com



