Download a working local copy of a webpage as a single html file

I followed the instructions provided in this previous post. I am able to download a working local copy of the webpage (e.g. wget -p -k), but I would like to integrate all the files (js, css and images, e.g. using base64 encoding) into a single html file (or another convenient format). Would this be possible?

It certainly can be done. But you'll have to do a couple of simple things manually, since there are no available tools to automate some of the steps.

  1. Download the web page with all its dependencies using Wget.
  2. Copy the contents of the linked stylesheets and scripts into the main HTML file.
  3. Convert the images referenced in the HTML and CSS to Base64 data URIs, then insert them into the main HTML file.
  4. Minify the edited HTML file.
  5. Convert the HTML file to a Base64 data URI.
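Steps 2 and 3 above can be sketched as a short script. This is a rough illustration rather than a finished tool: it assumes the page and its assets were already downloaded locally (step 1), that references use simple double-quoted relative paths, and it leaves out minification (steps 4 and 5). The function name inline_assets is my own.

```python
import base64
import mimetypes
import pathlib
import re

def inline_assets(html_path):
    """Inline local CSS, JS and images referenced by a downloaded HTML file.

    Sketch only: assumes wget has already fetched the page and its
    dependencies, and that references are simple relative paths.
    """
    root = pathlib.Path(html_path).parent
    html = pathlib.Path(html_path).read_text(encoding="utf-8")

    def read(rel):
        return (root / rel).read_text(encoding="utf-8")

    # <link rel="stylesheet" href="..."> -> <style>...</style>
    html = re.sub(
        r'<link[^>]*href="([^"]+\.css)"[^>]*>',
        lambda m: "<style>%s</style>" % read(m.group(1)),
        html)

    # <script src="..."></script> -> <script>...</script>
    html = re.sub(
        r'<script[^>]*src="([^"]+)"[^>]*></script>',
        lambda m: "<script>%s</script>" % read(m.group(1)),
        html)

    # <img src="..."> -> same tag with a Base64 data URI as the source
    def img(m):
        rel = m.group(1)
        mime = mimetypes.guess_type(rel)[0] or "application/octet-stream"
        data = base64.b64encode((root / rel).read_bytes()).decode("ascii")
        return m.group(0).replace(rel, "data:%s;base64,%s" % (mime, data))

    return re.sub(r'<img[^>]*src="([^"]+)"[^>]*>', img, html)
```

Real pages will need a more robust parser than these regexes, but the structure of the manual work is the same.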

Here is an example of a single-page application encoded as a Base64 data URI, created to demonstrate the concept (copy and paste the code below into your web browser's address bar):
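To illustrate the format (the original demo URI is not reproduced here), any HTML snippet can be turned into such a data URI; here is a trivial, hand-made example:

```python
import base64

# A trivial HTML page encoded as a Base64 data URI (illustration only).
page = b"<h1>Hello</h1>"
uri = "data:text/html;base64," + base64.b64encode(page).decode("ascii")
print(uri)  # data:text/html;base64,PGgxPkhlbGxvPC9oMT4=
```

Note that some modern browsers restrict navigating directly to data: URIs typed into the address bar.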


Try using HTTrack

It is a very efficient and easy-to-use website copier. All you have to do is paste in the link of the website you want to make a local copy of.

Follow these steps if you want everything in a single page:

  1. Minify all the stylesheets and put them inside <style> tags in your main HTML page, using a CSS minifier.
  2. Minify all the scripts and put them inside <script> tags in the same file, using a JavaScript minifier.
  3. To deal with images, use sprites.
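As an alternative to sprites, each image can also be embedded directly in the CSS as a Base64 data URI, which keeps everything in the one file. A minimal sketch (css_data_uri is a hypothetical helper name, not part of any tool mentioned above):

```python
import base64

def css_data_uri(image_bytes, mime="image/png"):
    # Return a CSS url() value embedding the raw image bytes.
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return "url(data:%s;base64,%s)" % (mime, b64)

# Usage in a stylesheet:
#   .logo { background-image: url(data:image/png;base64,...); }
```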

Another solution would be to use a web proxy with a custom extension in order to store the sources, cf.

This GitHub project is a simple web proxy of mine, written in Go. In Main.go, line 71 and beyond copy any data from the original site to your browser.

In your case, you would add a check whether the data is already stored. If so, load it from disk and send it to your browser; if not, load it from the source and store it to disk.

Your requirement of single-file storage would not be an issue: Go can read and write e.g. ZIP files, cf. If you need these web site dumps immediately, a bit of extra code is needed to follow all links so that everything is stored right away.
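The proxy in that project is written in Go, but the store-or-fetch logic with a ZIP archive as the single-file store can be sketched briefly; here in Python for compactness, with get_cached and the fetch callback as illustrative stand-ins for the proxy's request path:

```python
import zipfile

def get_cached(archive_path, url, fetch):
    """Serve `url` from the ZIP archive if present; otherwise call
    `fetch(url)` and store the result. `fetch` stands in for the
    proxy's upstream request."""
    name = url.replace("://", "_").replace("/", "_")  # crude archive key
    try:
        with zipfile.ZipFile(archive_path) as z:
            if name in z.namelist():
                return z.read(name)  # cache hit: serve from disk
    except FileNotFoundError:
        pass  # archive not created yet
    body = fetch(url)  # cache miss: fetch from the original site
    with zipfile.ZipFile(archive_path, "a") as z:
        z.writestr(name, body)  # store for next time
    return body
```

The Go version would use archive/zip and net/http in the same shape.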

Therefore, this answer is not a ready-to-go solution to your question; it requires a little coding. Go code can be compiled for any platform (x86, ARM, PPC) and operating system (Linux, macOS, Windows).

I hope this answer gives you an option.

  • Thank you for the answer, however I'm looking for an automated way to achieve this, even if it requires using several applications/commands. I'll wait for another answer.
  • Hey Mat, I answered your original inquiry. Nowhere in your question have you emphasized that you were looking for an automated way to perform this action.
  • HTTrack runs as a Windows cmd.exe command, but also has a GUI version for Windows. Very handy.