Does anyone know a way to get all the URLs in a website using JavaScript?
I only need the links starting with the same domain name; no need to consider the other links.
Accepted answer

This will get all the same-host links on the page:

```javascript
var urls = [];
for (var i = document.links.length; i-- > 0;)
    if (document.links[i].hostname === location.hostname)
        urls.push(document.links[i].href);
```
If by site you mean you want to recursively get the links inside linked pages, that's a bit trickier. You'd have to download each linked page into a new document (for example into an `<iframe>`), and on load check the iframe's own document for more links to add to the fetch list. You'd need to keep a lookup of the URLs you'd already spidered, to avoid fetching the same document twice. It probably wouldn't be very fast.
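The crawl loop described above can be sketched as a breadth-first traversal with a visited set. This is a minimal sketch, not the answer's actual implementation: `getLinks` is a hypothetical callback that, given a URL, returns that page's same-host link URLs (in a browser you might implement it with an `<iframe>` and its `load` event, or with `fetch()` plus `DOMParser`; here it is left pluggable so the traversal logic stands on its own).

```javascript
// Breadth-first crawl of same-host links.
// `getLinks(url)` is an assumed async callback returning an array of
// link URLs found on that page; the DOM-specific part lives there.
async function crawl(startUrl, getLinks) {
  const visited = new Set([startUrl]); // URLs already spidered
  const queue = [startUrl];            // URLs still to fetch
  while (queue.length > 0) {
    const url = queue.shift();
    for (const link of await getLinks(url)) {
      if (!visited.has(link)) {       // skip documents we've seen
        visited.add(link);
        queue.push(link);
      }
    }
  }
  return [...visited];
}
```

Because the visited set is checked before enqueueing, each document is fetched at most once even when pages link back to each other.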