Website Backups: An Ethical Necessity?

Our website has been up for some time, but I haven’t made many changes to it. Adding videos and a few other things is on my to-do list, but I’m glad I haven’t gotten to it yet. This past weekend my partner and I attended a PA Bar Association Young Lawyers retreat. During one of the CLEs, a panelist pointed out that the Pennsylvania Rules of Professional Conduct require attorneys to keep a record of every advertisement. No news there; we keep copies of anything we send out. But she posed an interesting question: what about websites?

You can discuss The Shingle Life in the comments, in the LAB, or on Twitter using the hashtag #shinglelife.

The Rule

Pennsylvania Rule of Professional Conduct 7.2(b) states:

A copy or recording of an advertisement or written communication shall be kept for two years after its last dissemination along with a record of when and where it was used. This record shall include the name of at least one lawyer responsible for its content.

My reading of this rule is that websites, which are advertisements, have to be copied and retained for two years after their last dissemination. As for “when and where it was used,” I think a record of the URL should be sufficient.

The Solution

I don’t believe page version history is sufficient for keeping a “copy or recording” of the website. In WordPress, you can see the various revisions each page has gone through, but if you take the site down completely and no longer own the URL, you won’t have access to that information. The only thing I could think of was saving each page as a PDF. No small task, I can assure you, and that’s for a relatively small website. We have only about twenty pages on the whole site, so I spent my morning opening each page, printing it to a PDF, saving it, titling it, and moving on to the next page.

By the end I had no motivation to create more content, since that would only mean more backups in the future. Luckily, I stumbled on a significantly easier solution. Assuming you have Acrobat Pro, all you have to do is go to File > Create PDF > From Web Page and you will get a dialog box. Put in your URL, make sure to click “Capture Multiple Levels,” check the box to capture the entire site, and you’re good to go.

Adobe warned me that it would take a good chunk of time and hard drive space, but by the time I made some coffee and came back, the whole operation was done. For our twenty-page site, the whole file was 210 KB. Not too bad.

Does your jurisdiction require a copy of advertising? If so, do you keep a copy of your website?


  1. Adam Lilly says:

    I export a copy of my site to my computer as an XML document any time I make substantive revisions (and now, as those become less frequent, any revisions). I would assume that would qualify, and it’s really easy to do in WordPress. I do it both because of a similar ethical rule, and (more practically) for backup in case anything happens to the site – which I think you should consider doing as well.

  2. I use VaultPress for all of my WordPress sites. Fifteen bucks a month, and it creates a continuous backup of your site. It’s run by the same guys who created WordPress, and it installs and configures in about 5 minutes via a plugin.

    I had a database corrupt one time; I went in, clicked restore to a point about 30 minutes earlier, and was back up and running in less than 10 minutes.

    You can also download a copy of your website from their server at any time.

    Set it and forget it simple, which works for me because I am prone to the forgetting it part.

  3. Krueger says:

    Umm, you printed each page separately to Acrobat? Wow. What a waste of time.

    Next time, just tell Acrobat to capture the website. It will save the whole dang thing with one command. I don’t know which version you’re using, but in Acrobat X, simply go to File | Create | PDF From Web Page and then type in your URL….

    This will preserve the links, and like any file, it will have a date stamp. And you are done.

  4. Steve Mayne says:

    Regardless of the solution you use, make sure that you:

    1) Verify each backup you take (to ensure it’s not corrupt and that it actually works).
    2) Run backups regularly (and ideally in an automated manner). Backups are no good unless they are timely.

  5. Chris says:

    If you’re careful about your parameters and are using OS X or Linux (or can find the Windows binary), either curl or wget will let you snapshot an entire website fairly quickly. It’s not as pretty as PDFs, but the HTML and images obtained are the same and can reproduce the original site.

    I use the same tools to regularly back up work; if you tar the output and point it at a Dropbox-monitored directory (or another backup location), it’ll be saved for posterity automatically.
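    The wget-plus-tar approach above can be sketched as a short script. A minimal sketch, assuming a POSIX shell with wget and tar available; SITE_URL and BACKUP_DIR are hypothetical placeholders, not from the comment — substitute your own site, and point BACKUP_DIR at a Dropbox-synced folder if you want the automatic off-site copy.

    ```shell
    #!/bin/sh
    # SITE_URL and BACKUP_DIR are placeholder values -- adjust for your setup.
    SITE_URL="https://example.com/"
    BACKUP_DIR="$HOME/site-backups"
    STAMP=$(date +%Y-%m-%d)
    SNAPSHOT="/tmp/site-snapshot-$STAMP"

    mkdir -p "$BACKUP_DIR" "$SNAPSHOT"

    # Mirror the site: follow internal links, fetch page requisites
    # (images, CSS), and rewrite links so the local copy browses offline.
    wget --quiet --mirror --page-requisites --convert-links \
         --directory-prefix="$SNAPSHOT" "$SITE_URL"

    # Archive the dated snapshot into the backup directory.
    tar -czf "$BACKUP_DIR/site-$STAMP.tar.gz" -C "$SNAPSHOT" .
    ```

    Because the tarball name carries the date, each run leaves a distinct, timestamped record — handy for showing exactly what the site said and when.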
