Hello,
I'm planning to create a program that would generate a list of the links in a web page.
I assume I would have to download the page and then parse the <a href=""></a> tags.
Is there a way to recursively read all the links "online"? (I found nothing for this particular purpose...)
Thanks for any ideas.
Pierre
How to get a list of links in a web page ?
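The download-then-parse approach you describe can be sketched like this. It's Python rather than PureBasic (I don't have a ready PureBasic example handy), but the idea carries over: fetch the page, then walk the HTML and collect every href:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_in_page(url):
    # Download the page, then feed its HTML to the parser.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```

To crawl recursively, you'd call links_in_page() on each collected URL in turn, keeping a list of pages you've already visited so you don't loop forever.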
Maybe the source code of Freak's IETool can give you a hint:
viewtopic.php?t=2698
It's an extension for MS Internet Explorer which allows you to process the selected text of an HTML page with PureBasic.
Next, I'd check FloHimself's Regular Expression library; it's ideal for parsing out links. I don't have a link to Flo's lib handy, but if you search the forum for "Regular Expression", you should come up with a result quickly. An expression like this would do:
Code:
<a href=".*?</a>
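In case it helps, here's how such an expression can be used to pull out just the URLs, again sketched in Python for illustration. The non-greedy .*? is important: a greedy .* would swallow everything from the first link on the page to the last, and the capture group keeps only the URL itself:

```python
import re

html = '<a href="one.html">One</a> and <a href="two.html">Two</a>'
# Non-greedy .*? limits each match to a single tag;
# the group (...) captures only the URL inside the quotes.
links = re.findall(r'<a href="(.*?)"', html)
print(links)  # ['one.html', 'two.html']
```

Note that a plain regex like this will miss links whose tags have extra attributes before href, or use single quotes; for anything beyond quick extraction a real HTML parser is more robust.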

