Hey guys,
Does anyone know how Google crawls and indexes the links on a webpage?
Which tool or software does Google use for this task?
I think this is the best explanation you'll find of how Google crawls and indexes web pages.
Here is another good explanation.
Basically, Google uses automated programs to look for new web pages and to check existing pages for changes. These programs fetch the documents and then analyze them.
When Google sees a link to a URL it didn't know about before, it schedules its crawler (Googlebot) to go fetch that new page.
So Google discovers new web pages either when someone submits a URL to them directly, or when they find a new URL on a page they just fetched.
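That fetch-then-follow-links loop is really just a breadth-first search over URLs. Here's a minimal sketch of the idea in Python. To keep it self-contained it "fetches" from a made-up in-memory dictionary of pages instead of the real web, and the URLs are invented for the example; a real crawler like Googlebot is of course vastly more sophisticated (politeness rules, robots.txt, scheduling, dedup, etc.):

```python
from collections import deque
from html.parser import HTMLParser

# Hypothetical in-memory "web": URL -> HTML body (stands in for real HTTP fetches)
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a>',
    "http://example.com/a": '<a href="http://example.com/">home</a> '
                            '<a href="http://example.com/b">B</a>',
    "http://example.com/b": "no links here",
}

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, like a crawler's link parser."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed):
    """Breadth-first discovery: fetch a page, extract its links,
    and schedule any URL we haven't seen before."""
    seen = {seed}
    queue = deque([seed])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(PAGES.get(url, ""))   # "fetch" and parse the document
        for link in parser.links:
            if link not in seen:          # new URL discovered -> schedule it
                seen.add(link)
                queue.append(link)
    return order

print(crawl("http://example.com/"))
```

Every page is fetched exactly once because the `seen` set records which URLs have already been scheduled, which is the same reason a crawler only queues URLs it didn't know about before.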