
I have a long wifi-free journey ahead of me, and I would like to continue productively reviewing bugs even without the ability to change them.

Is there an easy way to cache all the pages generated from the query below?

https://bugzilla.gnome.org/buglist.cgi?product=banshee&bug_status=UNCONFIRMED&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&bug_severity=blocker&bug_severity=critical&bug_severity=major&bug_severity=normal&bug_severity=minor&bug_severity=trivial&gnome_version=

  • I'm nearly sure it is asked specifically NOT to do this on live.gnome.org. I may be wrong, but you should probably double-check. – Maja Piechotka Aug 26 '10 at 11:07
  • Too late, I was already banned... for attempting to increase my productivity. I even observed every kindness recommended by wget, specifically to avoid putting an undue load on the server.

    Regardless, putting such a warning on a separate website just won't cut it.

    There should be a way to do this; it is simply not realistic to expect everyone to be online at all times. All I want is to continue reviewing the some 600 bugs left in Banshee while I am travelling.

    I guess those bug reviews will be left to other people again, since I am now entirely without the tools to do so.

    –  Aug 27 '10 at 11:01

2 Answers


You can use the wget command with the --recursive option. But be aware that it could download a lot of pages :) . To limit the result, you can also use the --domains=domain-list argument if you only want pages from a specific domain (or several domains, separated by commas), and the --level=depth argument to specify the depth of recursion.

So, your command could look something like this:

wget --recursive --domains=bugzilla.gnome.org --level=5  https://bugzilla.gnom...

But wget has many more options. Check out the wget man page.

  • The exact command upon experimentation, with rate limiting enabled since we are ignoring robots.txt:

    wget -e robots=off --limit-rate=80k --recursive --domains=bugzilla.gnome.org --level=5 'https://bugzilla.gnome.org/buglist.cgi?product=banshee&bug_status=UNCONFIRMED&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&bug_severity=blocker&bug_severity=critical&bug_severity=major&bug_severity=normal&bug_severity=minor&bug_severity=trivial&gnome_version='

    (Note the quotes around the URL, so the shell doesn't split the command at each &.)

    –  Aug 25 '10 at 21:40
  • Doing so gets your IP banned, just for reference. –  Aug 27 '10 at 11:02
  • Sorry to hear you got banned. Maybe --wait 1 instead of --limit-rate=80k would've worked better. – The Silent Boatman Sep 05 '10 at 21:31
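A side note on the shell commands in this answer: the & characters in the buglist query string have to be quoted, or the shell splits the command at each one and backgrounds it, so wget only sees the URL up to the first parameter. A minimal sketch below demonstrates this; the shortened query URL is illustrative, not the full query from the question, and the wget flags in the comment are standard wget options:

```shell
#!/bin/sh
# Illustrative, shortened buglist query (not the full query from the question).
URL='https://bugzilla.gnome.org/buglist.cgi?product=banshee&bug_status=NEW&bug_status=ASSIGNED'

# Quoted, the whole query string survives as a single argument:
printf '%s\n' "$URL"

# Unquoted, the shell would stop at the first '&' and run the rest in the
# background, so wget would only be handed the URL up to '?product=banshee'.
# A politer recursive fetch might then look like (sketch, not run here):
#   wget --recursive --level=5 --wait=1 --convert-links \
#        --domains=bugzilla.gnome.org "$URL"
```

--wait pauses between requests (an alternative to --limit-rate) and --convert-links rewrites links in the downloaded pages so they work for offline browsing.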

Check out App::SD on CPAN.

SD is a peer-to-peer bug tracking system which we've built to share with just about anything. Contributors have helped us write adaptors for RT, Hiveminder, Trac, GitHub, Google Code, and Redmine. You can extend SD to sync with other bug tracking tools by writing a small bit of code. If you can help make SD work better with your bug tracker, drop us a line.

Update: I just realized that list doesn't yet include Bugzilla... sorry. But I'm going to leave the answer up in case it does in the future, someone wants to use it for offline caching of another bug tracker, or maybe you want to add Bugzilla support yourself.

– xenoterracide