Offline copy for worst case scenario #16826
Replies: 1 comment
A direct database dump will give you the maximum-fidelity copy and can also be used for disaster recovery. You can make this a read-only copy by setting MAINTENANCE_MODE so that logins don't try to update the last-login table. Things like media (images) and scripts/reports need to be copied too, though, unless they are on S3.

If you only want spreadsheet-style tables that you can refer to offline, then use the "export" functions, either as-is or with export templates that you create yourself. Then just hit the relevant export URLs in your cronjob instead of scraping the whole site.

Personally I'd like to see a way to export rack elevations as well-formatted PDFs, one or more pages per location. In the past I've resorted to taking screengrabs of a few racks at a time and pasting them together manually.
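As a rough sketch of that export-URL approach (the hostname, token, output paths, and the `MyTemplate` export template name are all placeholders; NetBox triggers a CSV export when you append `?export` to an object list view, or `?export=<template name>` for a custom export template — depending on your setup you may need session-cookie authentication rather than an API token):

```shell
#!/bin/sh
# Nightly CSV exports of selected NetBox tables (sketch; adjust for your instance).
# NETBOX, TOKEN, and OUT are placeholders.
NETBOX="https://netbox.example.com"
TOKEN="0123456789abcdef"                 # hypothetical API token
OUT="/var/backups/netbox-csv/$(date +%F)"
mkdir -p "$OUT"

# Built-in CSV export: append ?export to a list view.
curl -fsS -H "Authorization: Token $TOKEN" \
    "$NETBOX/dcim/devices/?export" -o "$OUT/devices.csv"

# Custom export template: ?export=<template name>.
curl -fsS -H "Authorization: Token $TOKEN" \
    "$NETBOX/ipam/ip-addresses/?export=MyTemplate" -o "$OUT/ip-addresses.csv"
```

Each export is a single request per table, so a run like this finishes in seconds rather than the hours a full-site crawl takes.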
Any advice for regularly creating an offline copy of a NetBox instance for quick emergency access? I set up a cronjob to dump an offline copy daily via httrack, but httrack is simply too slow to get it done within 24 hours. I tried httrack's caching options, but neither C1 nor C2 works; the original dump is never updated with any of the live changes. It's tiresome to troubleshoot since the crawl takes such an awfully long time.
I guess an obvious alternative would be a direct database dump + the config file, then popping it into a clean installation. Not as quick as just having a bunch of HTMLs available for offline use, though...
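A minimal nightly cron sketch of that database-dump route (the paths, database name, and install location below are assumptions for a typical source install; adjust for yours):

```shell
#!/bin/sh
# Nightly NetBox backup: database dump + config file + media (sketch only).
# Paths and the database name are assumptions for a typical install.
STAMP=$(date +%F)
DEST="/var/backups/netbox/$STAMP"
mkdir -p "$DEST"

# Consistent logical dump of the PostgreSQL database.
sudo -u postgres pg_dump netbox > "$DEST/netbox.sql"

# The config file, plus locally stored media (skip media if it lives on S3).
cp /opt/netbox/netbox/netbox/configuration.py "$DEST/"
tar -czf "$DEST/media.tar.gz" -C /opt/netbox/netbox media
```

Restoring is then a matter of loading `netbox.sql` with `psql` into a clean installation of the same NetBox version and dropping the config and media back in place.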