What you are trying to create is actually a download manager. It is easy to create a simple download manager in Java, but quite tedious to create a full-fledged one.
The idea behind it is simple. Say you have a webpage at the URL www.example.com/index.html. Downloading just index.html is easy. But to download all pages of a domain or website, you have to download index.html, then parse index.html for links that stay inside the domain (i.e. within www.example.com). You then download all of those links, go through the pages they point to, and find more links. This goes on until every link has been parsed once. So essentially you need to read a web page, grab its links, and download those links, repeating the process for each new page. Search for information on web crawlers, web page parsing, etc. A minimal sketch of this loop follows below.
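Here is a minimal sketch of that crawl loop in plain Java, just to illustrate the idea. The seed URL and domain name are placeholders, and the regex only catches absolute, double-quoted href links; a real crawler should use a proper HTML parser (e.g. jsoup) and respect robots.txt.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashSet;
    import java.util.Set;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class SimpleCrawler {

        // Naive link extraction: absolute http(s) links in double quotes only.
        private static final Pattern LINK = Pattern.compile("href=\"(https?://[^\"]+)\"");

        public static void main(String[] args) {
            String seed = "http://www.example.com/index.html"; // placeholder start page
            String domain = "www.example.com";                 // stay inside this host

            Deque<String> toVisit = new ArrayDeque<>();
            Set<String> visited = new HashSet<>();
            toVisit.add(seed);

            while (!toVisit.isEmpty()) {
                String url = toVisit.poll();
                if (!visited.add(url)) continue; // already fetched this page

                String html;
                try {
                    html = download(url);
                } catch (IOException e) {
                    System.err.println("Failed to fetch " + url + ": " + e.getMessage());
                    continue;
                }
                System.out.println("Downloaded " + url + " (" + html.length() + " chars)");

                // Queue every in-domain link we have not seen yet.
                Matcher m = LINK.matcher(html);
                while (m.find()) {
                    String link = m.group(1);
                    if (link.contains(domain) && !visited.contains(link)) {
                        toVisit.add(link);
                    }
                }
            }
        }

        // Reads the whole page body into a String.
        private static String download(String url) throws IOException {
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(new URL(url).openStream(), "UTF-8"))) {
                StringBuilder sb = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) {
                    sb.append(line).append('\n');
                }
                return sb.toString();
            }
        }
    }

To actually save the site for offline browsing, you would write each html string to a local file instead of just printing its length.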
If you are just trying to download a website, try software like FlashGet or Internet Download Manager. There are some open-source ones as well, so you could get the source code too.
Please go through the links below for more info:
http://www.9code.in/java-download-manager-with-full-source-code/
http://www.javaworld.com/article/2076095/core-java/download-a-website-for-offline-browsing.html
http://www.programcreek.com/2012/12/how-to-make-a-web-crawler-using-java/