Using Java, how can I extract all the links from a given web page?
Recommended answer

Download the page as plain text/HTML and pass it through Jsoup or HtmlCleaner. Both are similar and can parse even malformed HTML 4.0 syntax. You can then use the familiar HTML DOM methods such as getElementsByName("a"), or, with jsoup, it is even nicer: you can simply use
File input = new File("/tmp/input.html");
Document doc = Jsoup.parse(input, "UTF-8", "http://example.com/");
Elements links = doc.select("a[href]");          // a with href
Elements pngs = doc.select("img[src$=.png]");    // img with src ending .png
Element masthead = doc.select("div.masthead").first();

to find all links, and then get the details using

String linkhref = links.attr("href");

Taken from jsoup/cookbook/extracting-data/selector-syntax.
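Putting the snippets above together, a minimal self-contained sketch might look like this. It parses an inline HTML string (hypothetical markup, chosen for illustration) so it runs without network access; for a live page you could load it with Jsoup.connect(url).get() instead, as the answer suggests.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;

public class LinkExtractor {
    public static void main(String[] args) {
        // Hypothetical HTML standing in for a downloaded page
        String html = "<html><body>"
                + "<a href='http://example.com/a'>A</a>"
                + "<a href='http://example.com/b'>B</a>"
                + "<p>no link here</p>"
                + "</body></html>";
        Document doc = Jsoup.parse(html);

        // Select every anchor element that carries an href attribute
        Elements links = doc.select("a[href]");
        for (Element link : links) {
            System.out.println(link.attr("href"));
        }
    }
}
```

Note that links.attr("href") on an Elements collection returns the attribute of the first matching element only; loop over the collection, as above, to get every link.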
The selectors use the same syntax as jQuery; if you know jQuery's function chaining, you will certainly love it.
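As a quick sketch of that jQuery-like chaining (the class names here are made up for illustration): each select() returns an Elements collection that you can select() from again, narrowing the match step by step.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.select.Elements;

public class SelectorChaining {
    public static void main(String[] args) {
        // Hypothetical markup with two containers
        String html = "<div class='nav'><a href='/home'>Home</a></div>"
                + "<div class='footer'><a href='/about'>About</a></div>";
        Document doc = Jsoup.parse(html);

        // Chain selects: restrict to div.nav first, then pick its anchors,
        // so the footer link is never matched
        Elements navLinks = doc.select("div.nav").select("a[href]");
        System.out.println(navLinks.attr("href"));
    }
}
```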
Edit: In case you want more tutorials, you can try out this one made by mkyong:

www.mkyong.com/java/jsoup-html-parser-hello-world-examples/