Additional Information

Copy whole websites or sections locally for offline browsing

Latest Version: Cyotek WebCopy 1.9.1 Build 872
Requirements: Windows Vista / Windows 7 / Windows 8 / Windows 10
Updated: September 06, 2023
Author: Cyotek Ltd.
Category: File Transfer and Networking
License: Freeware
Language: English
Downloads: 56

Overview

Cyotek WebCopy is a free tool for copying full or partial websites locally onto your hard disk for offline viewing. It scans the specified website and downloads its content onto your hard disk. Links to resources such as style sheets, images, and other pages on the website are automatically remapped to match the local path. Using its extensive configuration, you can define which parts of a website will be copied and how. The software may be used free of charge, but, as with all free software, there are costs involved in developing and maintaining it.
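
As a rough illustration of the remapping idea, the sketch below maps URLs inside a site to relative local file paths, leaving external links untouched. The site URL, the default document name, and the mapping scheme are assumptions for this example only, not WebCopy's actual algorithm.

# A minimal sketch of link remapping: URLs inside the copied site are
# rewritten to relative local paths so saved pages link to each other
# offline. The directory -> index.html convention is an assumption.
from urllib.parse import urlparse

SITE = "https://example.com"  # placeholder site root

def to_local_path(url: str) -> str | None:
    """Map an in-site URL to a local file path; None means external."""
    parsed = urlparse(url)
    if parsed.netloc and parsed.netloc != urlparse(SITE).netloc:
        return None  # external resource: keep the original URL
    path = parsed.path or "/"
    if path.endswith("/"):
        path += "index.html"  # assumed default document name
    return path.lstrip("/")

print(to_local_path("https://example.com/blog/"))    # blog/index.html
print(to_local_path("https://other.org/style.css"))  # None (external)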

What can WebCopy do?

WebCopy examines the HTML markup of a website and attempts to discover all linked resources such as other pages, images, videos, and file downloads. It downloads each of these resources and continues to search for more. In this manner, WebCopy can "crawl" an entire website and download everything it sees in an effort to create a reasonable facsimile of the source website.
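
The crawl-and-download loop behind this behavior follows a common pattern, sketched below using only the Python standard library. The start URL, the same-host rule, and the page limit are illustrative assumptions; this is a generic crawler, not WebCopy's implementation.

# A minimal breadth-first crawl: fetch a page, collect the URLs of
# linked resources, and queue them for fetching in turn.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Pages, stylesheets, images and scripts all count as resources.
        wanted = {"a": "href", "link": "href", "img": "src", "script": "src"}
        if tag in wanted:
            for name, value in attrs:
                if name == wanted[tag] and value:
                    self.links.append(value)

def crawl(start: str, limit: int = 50) -> set[str]:
    """Crawl one host breadth-first, returning every URL visited."""
    host = urlparse(start).netloc
    queue, seen = [start], set()
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen or urlparse(url).netloc != host:
            continue  # already fetched, or an external resource
        seen.add(url)
        try:
            body = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable resources are simply skipped
        parser = LinkCollector()
        parser.feed(body)
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

print(crawl("https://example.com/"))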

What can WebCopy not do?

WebCopy does not include a virtual DOM or any form of JavaScript parsing. If a website makes heavy use of JavaScript to operate, WebCopy is unlikely to be able to make a true copy, because it cannot discover all of the website's content when JavaScript is used to dynamically generate links.
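
A small demonstration of this limitation: a static HTML parser finds no links in a page whose navigation is written by JavaScript at runtime. The markup is an invented example.

# The only anchor tag on this page exists inside a JavaScript string,
# so a static parse of the markup never discovers it.
from html.parser import HTMLParser

PAGE = """
<html><body>
  <div id="nav"></div>
  <script>
    document.getElementById("nav").innerHTML =
      '<a href="/hidden-page.html">Hidden page</a>';
  </script>
</body></html>
"""

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

parser = LinkCollector()
parser.feed(PAGE)
print(parser.links)  # [] -- the JS-generated link is never found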

WebCopy does not download the raw source code of a website; it can only download what the HTTP server returns. While it will do its best to create an offline copy of a website, advanced data-driven websites may not work as expected once they have been copied.

Features and Highlights

Rules
Rules control the scan behavior, for example excluding a section of the website. Additional options are also available such as downloading a URL to include in the copy, but not crawling it.
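
In code form, rule semantics of this kind might look like the hypothetical sketch below, where "exclude" skips a URL entirely and "download-only" saves the resource without crawling it for further links. The patterns and action names are invented for illustration and do not reflect WebCopy's rule syntax.

# Each rule pairs a URL pattern with an action; the first match wins,
# and unmatched URLs fall through to the default crawl behavior.
import re

RULES = [
    (re.compile(r"^/private/"), "exclude"),
    (re.compile(r"^/downloads/"), "download-only"),
]

def decide(path: str) -> str:
    for pattern, action in RULES:
        if pattern.search(path):
            return action
    return "crawl"  # default: download and follow links

print(decide("/private/admin.html"))   # exclude
print(decide("/downloads/setup.exe"))  # download-only
print(decide("/blog/post.html"))       # crawl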

Forms and Passwords
Before analyzing a website, you can optionally post one or more forms, for example to login to an administration area. HTTP 401 challenge authentication is also supported, so if your website contains protected areas, you can either pre-define user names and passwords or be automatically prompted for credentials while scanning.
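
The same two techniques can be sketched with the Python requests library: posting a login form so subsequent requests carry the session cookie, and supplying credentials for an HTTP 401 challenge. The URLs, field names, and credentials are placeholders; WebCopy configures all of this through its user interface rather than code.

# Form post plus 401 challenge authentication, as generic HTTP calls.
import requests

session = requests.Session()

# Post a login form first so the crawl can see pages behind the login.
session.post(
    "https://example.com/login",
    data={"username": "editor", "password": "s3cret"},
)

# Supply credentials for an HTTP 401 (Basic) challenge on a protected area.
response = session.get(
    "https://example.com/admin/",
    auth=("editor", "s3cret"),
)
print(response.status_code)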

Viewing links
After you have analyzed your website, the Link Map Viewer allows you to view all the links found in your website, both internal and external. Filtering allows you to easily view the different links found.

Configurable
There are many settings that control how your website will be crawled. In addition to the rules and forms mentioned above, you can also configure domain aliases, user agent strings, default documents, and more.

Reports
After scanning a website, you can view lists of pages, errors, missing pages, media resources, and more.

Regular Expressions
Several configuration options make use of regular expressions. The built-in editor allows you to easily test expressions.
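
Testing an expression outside the built-in editor amounts to running it against sample input, as in the snippet below; the pattern and URLs are examples only.

# Check which sample URLs a candidate expression matches.
import re

pattern = re.compile(r"\.(?:pdf|zip)$", re.IGNORECASE)

for url in ["/docs/manual.PDF", "/blog/post.html", "/files/tools.zip"]:
    result = "matches" if pattern.search(url) else "no match"
    print(f"{url}: {result}")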

Website Diagram
View and customize a visual diagram of your website, which can also be exported to an image.

Note: Requires .NET Framework.

Previous Versions

1.9.1 Build 872 (September 06, 2023, 4.04 MB)
1.9.0 Build 822 (June 24, 2023, 5.19 MB)
1.8.3 Build 768 (June 22, 2023, 5.19 MB)
1.8.2 Build 744 (June 21, 2023, 5.19 MB)