Getting all the Links
I have spent a lot of time in the past trying to get a simple site map, even for just a single page. Well, I stumbled across a little bit of JavaScript that does exactly what I need.
var x = document.querySelectorAll("a");
var myarray = [];
for (var i = 0; i < x.length; i++) {
    // Collapse whitespace in the link text and grab the resolved href
    var nametext = x[i].textContent;
    var cleantext = nametext.replace(/\s+/g, ' ').trim();
    var cleanlink = x[i].href;
    myarray.push([cleantext, cleanlink]);
}

function make_table() {
    var table = '<table><thead><tr><th>Name</th><th>Links</th></tr></thead><tbody>';
    for (var i = 0; i < myarray.length; i++) {
        table += '<tr><td>' + myarray[i][0] + '</td><td>' + myarray[i][1] + '</td></tr>';
    }
    table += '</tbody></table>'; // close the table so the markup is valid
    // Open a blank tab and write the table into it
    var w = window.open("");
    w.document.write(table);
}

make_table();
Run the above code in the developer console of a browser on the page you want the links for, and a new tab is generated with a list of the links on that page. Not only will it pull out the visible links, but if there are buttons with a menu of links, those will be extracted as well.
All in a simple table with two columns: Name and Links.
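If you only need the data rather than a rendered page, the same array can be inspected or exported straight from the console. As a minimal sketch (assuming the loop above has already run so myarray is populated), console.table and the copy() helper, both standard developer-tools utilities in Chrome and Firefox, will do the job:

// Pretty-print the name/link pairs directly in the console
console.table(myarray);
// Or copy a tab-separated version to the clipboard for pasting into a spreadsheet
// (copy() is a DevTools console utility, not a standard page API)
copy(myarray.map(function (row) { return row.join('\t'); }).join('\n'));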
This will certainly help me build some automated navigation to the correct pages without needing to rely on a recorder like Selenium or Katalon, and it lets me focus on writing clean code from scratch.
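As a rough sketch of that idea, assuming Node.js with the selenium-webdriver package installed and a chromedriver on the PATH, the harvested URLs can drive the browser directly; the links array and the example URL below are placeholders standing in for the Name/Links pairs extracted above:

// Sketch only: replace links with the pairs harvested by the console snippet
const { Builder } = require('selenium-webdriver');

const links = [['Pricing', 'https://example.com/pricing']];

(async function navigate() {
    const driver = await new Builder().forBrowser('chrome').build();
    try {
        for (const [name, url] of links) {
            console.log('Visiting ' + name);
            await driver.get(url); // navigate straight to the harvested URL
        }
    } finally {
        await driver.quit();
    }
})();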
Hope this helps others.