Sorry, I didn't know you had a Mac. Here's a more detailed overview of the process:
http://www.cfi.au.dk/publikationer/archiving/webhttrack

Note that it's a bit dated, from 2004. I'm not a Mac user, so I couldn't tell you whether the tools and dependencies he lists are part of the default configuration on a Mac or not. Also note that on a Mac, HTTrack is a terminal program, so you'll need to learn its command-line parameters to make it work the way you want.
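To give you a feel for those command-line parameters, here's a minimal sketch of a typical HTTrack mirror job. The URL, output directory, and filter pattern are placeholders; substitute your target site:

```shell
# A basic HTTrack invocation (example.com is a placeholder):
#   -O    output directory for the mirrored files
#   +...  filter: only follow links under this domain
#   -r6   limit recursion (link depth) to 6 levels
#   -v    show verbose progress in the terminal
httrack "http://example.com/" -O ./mirror "+*.example.com/*" -r6 -v
```

Run `httrack --help` for the full list of options; the filters (the `+`/`-` patterns) are usually the part worth tuning first, since they control how much of the site gets pulled down.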
There are a few other free site rippers for the Mac, none of them as feature-rich as HTTrack, but they may be enough for your needs:
Site Sucker

Proxy Offline Browser - this is an odd one. It works by archiving pages as you visit them, so you'll have to trawl the site with your browser to get a complete copy.
CocoaWGet - a GUI for the venerable WGet program. For a screenshot of the English version, go here and click the tiny monitor icon.
Web Grabber

Here are some payware programs, in case you want to either buy one or just try to snag the Chinese site during the demo period:
http://www.limit-point.com/BlueCrab/BlueCrab.html
http://www.pagesucker.com/
http://www.maximumsoft.com/products/wc_mac/overview.html
http://www.chaoticsoftware.com/ProductPages/WebDevil.html
http://www.maxprog.com/WebDumper.html