Download a working local copy of a webpage as a single HTML file

I followed the instructions provided in this previous post. I am able to download a working local copy of the webpage (e.g. wget -p -k https://shapeshed.com/unix-wget/) but I would like to integrate all the files (js, css and images e.g. using base64 encoding) into a single html file (or another convenient format). Would this be possible?

It certainly can be done. But you'll have to do a couple of simple things manually, since there are no available tools to automate some of the steps.

  1. Download the web page using Wget with all dependencies.
  2. Copy the contents of the linked stylesheets and scripts into the main HTML file.
  3. Convert the images referenced in the HTML and CSS to Base64 data URIs, then insert them into the main HTML file (see the sketch after this list).
  4. Minify the edited HTML file.
  5. Convert the HTML file to a Base64 data URI.
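
To give a concrete idea of steps 3 and 5, here is a minimal Go sketch (the file name and the PNG MIME type are just example assumptions) that reads an image downloaded by Wget and prints an <img> tag with the picture embedded as a Base64 data URI. The same encoding, with a data:text/html;charset=utf-8;base64, prefix instead, is what step 5 does to the finished HTML file.

```go
package main

import (
	"encoding/base64"
	"fmt"
	"os"
)

// Read an image file and print an <img> tag that embeds it as a
// Base64 data URI, ready to paste into the main HTML file.
func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: inline-image <file.png>")
		os.Exit(1)
	}
	data, err := os.ReadFile(os.Args[1]) // e.g. an image saved by wget -p
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	encoded := base64.StdEncoding.EncodeToString(data)
	fmt.Printf("<img src=\"data:image/png;base64,%s\" alt=\"\">\n", encoded)
}
```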

Here is an example of a single-page application encoded as a Base64 data URI, created to demonstrate the concept (copy and paste the code below into your web browser's address bar):

data:text/html;charset=utf-8;base64,PCFkb2N0eXBlIGh0bWw+DQo8aHRtbCBsYW5nPSJlbiI+DQoJPG1ldGEgY2hhcnNldD0idXRmLTgiPg0KCTx0aXRsZT5TaW5nbGUtcGFnZSBBcHBsaWNhdGlvbiBFeGFtcGxlPC90aXRsZT4NCgk8c3R5bGU+DQoJCS8qIENvZGUgZnJvbSBDU1MgZmlsZXMgZ29lcyBoZXJlLiAqLw0KCQlib2R5IHsNCgkJCWZvbnQtZmFtaWx5OiBzYW5zLXNlcmlmOw0KCQl9DQoJCWJ1dHRvbiB7DQoJCQlkaXNwbGF5OiBibG9jaw0KCQl9DQoJPC9zdHlsZT4NCgk8c2NyaXB0Pg0KCQkvLyBDb2RlIGZyb20gLmpzIGZpbGVzIGdvZXMgaGVyZS4gDQoJCWZ1bmN0aW9uIGNoYW5nZVBhcmFncmFwaCgpIHsNCgkJICAgIGRvY3VtZW50LmdldEVsZW1lbnRzQnlUYWdOYW1lKCJwIilbMF0uaW5uZXJIVE1MID0gIkNvbnRlbnQgb2YgcGFyYWdyYXBoIGNoYW5nZWQuIjsNCgkJfQ0KCTwvc2NyaXB0Pg0KCTxib2R5Pg0KCQk8aW1nIHNyYz0iZGF0YTppbWFnZS9wbmc7YmFzZTY0LGlWQk9SdzBLR2dvQUFBQU5TVWhFVWdBQUFVQUFBQUR3QkFNQUFBQ0RBNkJZQUFBQU1GQk1WRVZVVmx1T2o1TC8vLzlrWm1xbXA2bUJnb1dhbTUyeHNyTnpkSGk4dkw3dDdlNzI5dmJHeDhqazVPVFEwZExhMnR2SHNtSDhBQUFDSjBsRVFWUjRBZXpCZ1FBQUFBQ0FvUDJwRjZrQ0FBQUFBQUFBQUFBQUFBQUFBQUFBWUExdElLU2twRERxUUdMQXFBTkhIY2dzSWd3a3d4SUJ6SllCaEJSaEdJYmZiWGZiMWUzcU5vRUU5NVN1bTJuM1Z1SndNSHNRa0FGUVpBVUF4bDA2UU9zRXVNaENDTWNRQVRFWEJhaURBOGdFSUpJQXNKYUFNdmsrVGdrQTVuL2cvN3p2NE9HYitZMmN4djdqVkVaMzRLZG5kNStrTlFudXd1b2NNbDJCOTVZZUZoRHZTVHFmRTAwdldhV3RBcUtrTnNHcndFWUw0S1BrSjNFcW5WanNndTBTWURTdVM5Qk1lQUN3WnFGenJBN0dyZ2x1NHl6cUVuUnlnSkdVdzlzU050ekt5YlNFelNXczF5VzR1WjhEcDY4QXRlR1dXaEJaTVp6TWdhd0J3M0d6SkI3WEpQaFoyN0N1aGd0VzFVSXFRVXY0WXFwa1BiZ21IVUJTazJDaUh0ejA3Y294T1JVdzlTbTdBQXVwRHkvcXVtYlVzY20xcEdkSHZ3RUVTRlpuNTNCZ0VZTGdJUTVOd0o4aHV4MlNZTHZBUVlFS1hvVG81YVQ4ZjhXZkJrWWFnT0FCTEh4U0RvbFVRcllDMytUVUwrZ3JWYk1BZlljM1Z2ZzFjeXoxcWlvTFEvQ0RuZ042QlBGcGVYWlJ6NXB6U0FJUVhBRytBcWlQVVVCbXhYQUprUUlRN0dEa1o5OXp2UFBQejhKYUNJSTZBYTc3ZEI5NDdlOWt0d1NJVjRNUWJPV01VcDkwci9veGRrRjFjb2oyRkFiZHdWaC9zUlZiZUhreVUyQThyYXBVV3NKVVliSUQ3MllQSVZhZzlNRzVvVUJwbGppSlFtVUw0NmZDNWM1UjlldFBlM0FnQUFBQWdBQm83UEZYR0tCcUFBQUFBQUFBQUFBQUFBQUFBQUFBQUxnTmtYVy9TUloxSldBQUFBQUFTVVZPUks1Q1lJST0iIGFsdD0iIj4NCgkJPGgxPlNpbmdsZS1wYWdlIEFwcGxpY2F0aW9uIEV4YW1wbGU8L2gxPg0KCQk8cD5UaGlzIGlzIGFuIGV4YW1wbGUgb2YgYSB3ZWIgYXBwIHRoYXQgaW50ZWdyYXRlcyBIVE1MLCBDU1MsIEphdmFTY3JpcHQsIGFuZCBhbiBpbWFnZSBpbnRvIG9uZSAuaHRtbCBmaWxlIHRoYXQgaXMgZW5jb2RlZCB0byBCYXNlNjQuPC9wPg0KCQk8YnV0dG9uIHR5cGU9ImJ1dHRvbiIgb25jbGljaz0iY2hhbmdlUGFyYWdyYXBoKCkiPkNoYW5nZSBQYXJhZ3JhcGg8L2J1dHRvbj4NCgk8L2JvZHk+DQo8L2h0bWw+

How to Save Web Pages for Offline Access Later: How do I save a webpage as an HTML file? I would like to download a local copy of a web page and get all of the CSS, images, JavaScript, etc. In previous discussions (e.g. here and here, both of which are more than two years old), two suggestions are generally put forward: wget -p and HTTrack. However, both of these suggestions fail.

Try using HTTrack

It is a very efficient and easy-to-use website copier. All you have to do is paste the link of the website you want to make a local copy of.

Follow these steps, since you want everything in a single page (a small inlining sketch follows the list):

  1. Minify all the stylesheets and put them inside <style> in your main HTML page (use a CSS minifier).
  2. Minify all the scripts and put them inside <script> in the same file (use a JavaScript minifier).
  3. To deal with images, use sprites.
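
If you prefer to script steps 1 and 2 instead of pasting by hand, the following Go sketch shows one way to do it. The file names and the /*CSS*/ and /*JS*/ placeholders are assumptions for illustration, not part of any existing tool.

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// Inline a minified stylesheet and script into an HTML page that
// contains the placeholders /*CSS*/ and /*JS*/ inside otherwise empty
// <style> and <script> tags. All file names are examples.
func main() {
	html, err := os.ReadFile("index.html")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	css, err := os.ReadFile("styles.min.css")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	js, err := os.ReadFile("scripts.min.js")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	page := string(html)
	page = strings.Replace(page, "/*CSS*/", string(css), 1)
	page = strings.Replace(page, "/*JS*/", string(js), 1)

	if err := os.WriteFile("single-page.html", []byte(page), 0644); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```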

6 Ways to Download and Read Websites Offline: saving a page this way produces an HTML file and a folder full of images and other data contained within; don't delete this. SingleFile is a cross-browser open-source extension for Firefox, Google Chrome and Opera to save any webpage you come across as a single HTML file. All modern web browsers come with options to save webpages; all you have to do is press Ctrl-S to save the webpage to the local system.

Another solution would be to use a web proxy with a custom extension in order to store the sources, cf. https://github.com/SommerEngineering/WebProxy

This GitHub project is a simple web proxy of mine, written in Go. Inside Main.go, the code from line 71 onwards copies any data from the original site to your browser.

In your case, you would add a check for whether the data is already stored or not. If so, load it from disk and send it to your browser. If not, load it from the source and store it on disk.
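
As a rough illustration of that check, a handler could look like the sketch below. This is hypothetical code, not taken from the linked WebProxy repository; the ?url= query parameter and the ./cache directory are assumptions.

```go
package main

import (
	"io"
	"net/http"
	"net/url"
	"os"
	"path/filepath"
)

// Serve a resource from the local cache if it is already stored,
// otherwise fetch it from the source, store it, and forward it.
func serve(w http.ResponseWriter, r *http.Request) {
	target := r.URL.Query().Get("url") // e.g. ?url=https://example.com/style.css
	cachePath := filepath.Join("cache", url.PathEscape(target))

	// Already stored? Load it from disk and send it to the browser.
	if data, err := os.ReadFile(cachePath); err == nil {
		w.Write(data)
		return
	}

	// Not stored yet: load it from the source ...
	resp, err := http.Get(target)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()
	data, err := io.ReadAll(resp.Body)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}

	// ... store it on disk and forward it to the browser.
	os.WriteFile(cachePath, data, 0644)
	w.Write(data)
}

func main() {
	os.MkdirAll("cache", 0755)
	http.HandleFunc("/", serve)
	http.ListenAndServe(":8080", nil)
}
```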

Your requirement of single-file storage would not be an issue: Go can read and write e.g. ZIP files, cf. https://golang.org/pkg/archive/zip/. If you need these website dumps immediately, a bit of code is needed to follow all links so that everything is stored right away.
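
For the single-file part, here is a minimal sketch of writing several downloaded resources into one ZIP archive with the standard library; the file names and contents are placeholders standing in for a real crawl.

```go
package main

import (
	"archive/zip"
	"fmt"
	"os"
)

// Write a few resources into a single ZIP file using archive/zip.
func main() {
	out, err := os.Create("site-dump.zip")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer out.Close()

	zw := zip.NewWriter(out)
	defer zw.Close()

	// In a real crawl these would be the downloaded page and its assets.
	files := map[string][]byte{
		"index.html": []byte("<!doctype html><title>Dump</title>"),
		"style.css":  []byte("body{font-family:sans-serif}"),
	}
	for name, content := range files {
		w, err := zw.Create(name)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		if _, err := w.Write(content); err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
	}
}
```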

Therefore, this answer is not a ready-to-go solution to your question; rather, it requires a little bit of coding. Go code can be compiled for any platform (x86, ARM, PPC) and operating system (Linux, macOS, Windows).

I hope this answer gives you an option.

Download a working local copy of a webpage as a single HTML file: tap on the main menu button at the top-right corner, then tap on the “Download” icon and the page will be downloaded to your device. After all, SingleFile is the easiest and most convenient solution for saving any web page as a single HTML file while keeping the file usable in any browser. Since SingleFile is a Chrome extension, and Chrome is a cross-platform browser, you can use this extension on Mac, Linux and Chrome OS too. Author: Jaber Al Nahian

Save Page WE: To use cliget, visit a page or file you wish to download and right-click. A context menu called cliget appears, with options to copy to wget and copy to curl. Click the copy-to-wget option, open a terminal window, then right-click and choose paste. The appropriate wget command is pasted into the window.

SingleFile: Save a complete web page (as currently displayed) as a single HTML file. HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility. It allows you to download a World Wide Web site from the Internet to a local directory, building all directories recursively and getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link structure.

Quick tip: save a webpage with all images, including the stylesheet, as a complete page in a single HTML file. Five ways to save a Web page: Internet Explorer, Firefox, and Google Chrome make it easy to save a Web page as an HTML file for viewing offline, but that is far from your only option.

Comments
  • Thank you for the answer, however I'm looking for an automated way to achieve this, even if it requires using several applications/commands. I'll wait for another answer.
  • Hey Mat, I answered your original inquiry. Nowhere in your question have you emphasized that you were looking for an automated way to perform this action.
  • HTTrack runs as a Windows cmd.exe command, but also has a GUI version for Windows. Very handy.