No software can test your website and declare it fully ADA (or, more specifically, WCAG) compliant, because some of the checks must be made by a human or are subjective.

- For strict validation, a page can be passed to the W3C validator with a single click.
- A number of tests are run on the HTML of each page as it is scanned, with a focus on well-formedness and issues that may have real consequences.
- Reports can be provided to anyone within an organisation, or (with a customised logo/header) offered as a service to your customers. With the option switched on in Preferences, simply transfer the files to a web server and anyone with the link can view or download them.
- Scrutiny's data is optionally autosaved, or can be saved and reloaded manually, so you can carry on working on broken links or other issues another day without re-scanning.
- Build your own custom reports using external tools: the exported CSV files comply with the requirements of Google Data Studio.
- A pie chart (for links) and a radar chart (for SEO) are included in the summary report.
- The full report contains the summary report plus CSVs for the main tables.
- A customisable, branded summary report can be generated after a scheduled or ad-hoc scan.
- The summary report is visible at the end of a scan, containing stats on bad links, SEO problems and spelling/grammar issues.
- Choose the language used by the spell checker on a per-site basis.
- Checks for spelling issues on your pages as it scans; step through them one by one and see suggestions.
- Displays stats for each page, such as word count, link count, content size, image count and image weight.
- List deep content (more than X links from the home page).
- List pages with mixed content (http resources within an https page).
- List pages with a title that is too long (new since v5.6).
- List pages with a description that is too long or too short.
- List pages with possible duplicates (same content, different URL).
- List pages with missing SEO parameters (title, description, etc.).
- Keyword / phrase analysis: see the count for any word or phrase in url / title /
description / content. Double-click to see an analysis for that page, checking terms of up to four words.
- Keyword density alerts: see pages where any keyword occurs in the content above a particular threshold ('stuffing').
- Display SEO parameters such as URL, title, description, main headings and noindex/nofollow.
- See a list of pages which reference insecure / mixed content (images or other files).
- See a list of pages which contain links to the http version of the site.
- Check for insecure content and links to the old http site (useful when migrating to https).
- Use a context menu for options such as visit URL, copy URL, highlight link on page, re-check link and mark as fixed.
- Filter results to show bad links only, internal links, external links, images and more.
- Limit your crawl using blacklisting or whitelisting on the URL, and even on terms within the content.
- Support for attempting to spot 'soft 404s': pages where the server returns 200 even though the intended page was not found.
- The status of each link is clearly displayed (e.g. '200 no error').
- Test the links within a list of links, an XML sitemap, a local PDF document or a Word (.docx) document. (Testing local documents isn't as thorough as scanning via a server, because you're testing whether files exist rather than sending an HTTP request and receiving a server response code.)
- Option to scan PDF or DOCX documents to find links.
- Can optionally render pages, making it possible to scan sites that require client-side JavaScript rendering.
- Scan websites which require authentication (signing in / logging in).
- Better protection when disc space is low: the scan should stop before catastrophe happens, warn you, and offer the option to pause or continue.
- Search the full code or just the visible text.
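The 'soft 404' idea described above (a 200 response despite the intended page not being found) can be illustrated with a small sketch. This is not Scrutiny's actual algorithm, and the phrase list is purely an assumption for illustration: the idea is simply to flag a 200 response whose visible text reads like an error page.

```python
# Hypothetical sketch of soft-404 detection -- NOT Scrutiny's actual algorithm.
# A "soft 404" returns HTTP 200 even though the requested content is missing.

# Assumed, illustrative list of phrases that typically appear on error pages.
NOT_FOUND_PHRASES = (
    "page not found",
    "404 error",
    "does not exist",
    "no longer available",
)

def looks_like_soft_404(status_code: int, body_text: str) -> bool:
    """Flag a 200 response whose visible text suggests a missing page."""
    if status_code != 200:
        # A real 404/410 is already reported honestly by its status code.
        return False
    text = body_text.lower()
    return any(phrase in text for phrase in NOT_FOUND_PHRASES)
```

In practice a checker would also need to tolerate false positives (e.g. a blog post that mentions "404 error" legitimately), which is one reason such checks remain heuristic.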
- Search your site for pages containing (or not containing) specific text: a single term or multiple terms.
- Many options for doing other things while crawling, e.g. archiving and spell-checking.
- Many options for tailoring the crawl: ignoring query strings or not, ignoring trailing slashes or not, tolerance of coding errors such as mismatched quotes, etc.
- Many options for limiting the crawl: by number of levels or links, or by blacklisting or whitelisting.
- Handles tens of thousands of pages containing hundreds of thousands of links, drilling down as many levels as you like.
- Can store a huge amount of information, and handles large sites without slowing down.
- If your server can cope, turn up the number of threads and see how fast it goes.
- Fast, efficient and native to macOS (i.e. not Java, and not an iOS app running under Catalyst), which makes for efficiency and security.
- The autosave feature saves data for every scan, giving you easy access to results for any site you've previously scanned.

Please bear in mind that Scrutiny's licence is a one-off purchase, not an annual subscription. This page is for you if you're evaluating Scrutiny or comparing it with a competitor.
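Blacklist/whitelist crawl limiting of the kind mentioned above can be sketched in a few lines. The matching behaviour here (plain substring matching, whitelist checked first) is an assumption for illustration and not Scrutiny's implementation: a URL is crawled only if it matches at least one whitelist term (when a whitelist is set) and no blacklist term.

```python
# Illustrative sketch of blacklist/whitelist URL filtering -- the substring
# matching rules here are assumptions, not Scrutiny's implementation.

def should_crawl(url: str, whitelist: list[str], blacklist: list[str]) -> bool:
    """Crawl a URL only if it passes the whitelist (when one is set)
    and matches no blacklist term."""
    if whitelist and not any(term in url for term in whitelist):
        return False
    return not any(term in url for term in blacklist)
```

For example, whitelisting `example.com` and blacklisting `?` would keep the crawl on one site while skipping URLs with query strings.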