Automatically fetches the next pages of paginated websites and inserts them into the current page for infinite scrolling. Supports thousands of websites without requiring any rule.
Take it.
{ "name": "gog", "action": 1, "url": "^https://www\\.gog\\.com/", "pageElement": "paginated-products-grid.ng-star-inserted", "nextLinkByUrl": [ "(&page=(\\d+))?$", "&page={$2+1}" ], "pageAction": "[].forEach.call(eles, ele=>{[].forEach.call(ele.querySelectorAll('source[lazyload]'), source=>{source.setAttribute('srcset',source.getAttribute('lazyload'))})})" }
I remember asking about gog.com and you helped make the rule for it; however, I seem to be having issues when narrowing down search results. For instance, on gog.com, if I go to the store, then "On sale now", and narrow the search results to, say, 0.74 - 8.00, Pagetual doesn't work.
https://www.gog.com/en/games?priceRange=0.74%2C8&discounted=true
It seems that any time there's a filter narrowing the results, I have issues on this website.
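To illustrate, this is roughly how I understand the rule's nextLinkByUrl pair gets applied to that filtered URL (just my simulation; Pagetual itself evaluates the {$2+1} part, and I'm assuming an empty $2 ends up as 1):

// Rough simulation of how the rule's nextLinkByUrl pair builds the next-page URL.
// Pagetual evaluates the {$2+1} arithmetic itself; String.replace does not,
// so the callback below only imitates that (assumption: empty $2 yields 1).
const pattern = /(&page=(\d+))?$/;

function nextUrl(currentUrl) {
  return currentUrl.replace(pattern, (_match, _group, page) => {
    const next = page ? Number(page) + 1 : 1;
    return '&page=' + next;
  });
}

console.log(nextUrl('https://www.gog.com/en/games?discounted=true&page=3'));
// -> https://www.gog.com/en/games?discounted=true&page=4
console.log(nextUrl('https://www.gog.com/en/games?priceRange=0.74%2C8&discounted=true'));
// -> https://www.gog.com/en/games?priceRange=0.74%2C8&discounted=true&page=1

If my assumption about the {$2+1} evaluation is right, the filtered URL only ever gets &page=1 appended, but that's just my guess at what is going wrong.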
As always, thanks for your updates.