Is there a tool that will crawl my site looking for pages with "insecure" warnings?

I’ve just added SSL to one of my client’s sites and for various reasons, some of the site pages are displaying “insecure” warnings when users browse them.

This is a community site with well over 1,000 pages, so I’m looking for a tool that can crawl the site and tell me which pages are problematic so I can focus on fixing them.

Does anyone know of a tool that can crawl a site and generate a list of pages that contain insecure items?

Thanks,
Kevin

Could it be a browser issue? Have you tried restarting, or clearing your cache?

There are some ideas here.

Maybe I need to clarify.

I’m a webmaster.
Because of a new security policy, all 5,000 pages on my site are protected by SSL.
Some pages have insecure content because of code that references http instead of https.
This code, for example, <img src="http://example.com/foo.png" />, triggers a security warning in my users’ browsers because it pulls in an image that is not protected by HTTPS. If the code is changed to
<img src="https://example.com/foo.png" />, the security warnings go away.

I need a way to scan / crawl the 5,000 web pages to discover which ones need fixing.

Thanks,
Kevin
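
If no ready-made tool turns up, a small crawler is quick to put together. Here’s a rough sketch in Python (it needs the third-party requests package, and the start URL is a placeholder). It walks same-host links and flags pages that reference http:// resources in src/href attributes. It will over-report slightly, since a plain <a href> link to an http:// page doesn’t trigger a mixed-content warning, but it errs on the side of catching everything:

    # Rough sketch: crawl same-host pages and flag any page whose HTML
    # still references http:// resources in src/href attributes.
    # Assumes the third-party "requests" package; START is a placeholder.
    import re
    import requests
    from urllib.parse import urljoin, urlparse

    START = "https://www.example.com/"  # placeholder start page
    HOST = urlparse(START).netloc

    seen, queue, insecure = set(), [START], {}
    embed_re = re.compile(r'(?:src|href)\s*=\s*["\'](http://[^"\']+)', re.I)
    link_re = re.compile(r'href\s*=\s*["\']([^"\']+)', re.I)

    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        hits = embed_re.findall(html)
        if hits:
            insecure[url] = hits
        # follow links on the same host only
        for link in link_re.findall(html):
            absolute = urljoin(url, link).split("#")[0]
            if urlparse(absolute).netloc == HOST:
                queue.append(absolute)

    for page, refs in sorted(insecure.items()):
        print(page)
        for ref in refs:
            print("    " + ref)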

Sorry if I’m being simplistic. Presumably your 5,000 pages are backed up locally; wouldn’t a simple content search for ‘http://’ find what you are looking for? If they’re not backed up locally and you have direct access to the server, you could run the search there. Also, I seem to remember reading that if you have SSH access to the server, you can run a file content search remotely.
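
Something like this would do the local search – a rough Python sketch, with the backup directory as a placeholder path:

    # Walk a local backup of the site and report every file (and line)
    # that still contains a hard-coded http:// reference.
    # The root path is a placeholder.
    import os

    root = "/path/to/site-backup"

    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if "http://" in line:
                            print(f"{path}:{lineno}: {line.strip()}")
            except OSError:
                pass  # skip unreadable files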

This might also be of interest to you. According to the blurb, remote file contents can be searched. Never tried it myself, though.

Just grep out all protocol-specific URLs and switch to protocol-agnostic URIs – e.g. http://example.com becomes //example.com.
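
In Python terms, the rewrite might look like this rough sketch (it deliberately touches only src/href attributes – and note that a protocol-relative reference will break if the host doesn’t actually serve https):

    # Rewrite protocol-specific URLs in src/href attributes to
    # protocol-relative ones: http://example.com/x -> //example.com/x.
    import re

    def make_protocol_relative(html):
        return re.sub(
            r'((?:src|href)\s*=\s*["\'])http://',
            r'\1//',
            html,
            flags=re.IGNORECASE,
        )

    print(make_protocol_relative('<img src="http://example.com/foo.png" />'))
    # -> <img src="//example.com/foo.png" />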

Thanks for all the suggestions!
I do have access to the server (and the database where the content is stored).
So maybe I can search for all occurrences of http:// and replace them with //
I’ll give it a try.
Thanks again.

k,

First, the reason your visitors’ browsers are giving warnings is that you’re mixing HTTP with HTTPS requests. That warning is a function of the browsers and is both common and correct.

Second, my hosts have implemented a daily maldet scan, which checks for malware and can detect malicious code that may have been placed on your site.

Third, once maldet is detecting no malware, you can implement hash validation of your files to report which have been added, which have been deleted, and which have been altered. As the webmaster, you can then compare those reports with your own activity and investigate any anomalies.
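
As a rough sketch of the idea in Python (placeholder paths; in a real setup you’d store the manifest somewhere an attacker can’t reach):

    # Build a SHA-256 manifest of the site's files on the first run,
    # then compare against it on later runs to report added, deleted
    # and altered files. Both paths are placeholders.
    import hashlib
    import json
    import os

    ROOT = "/path/to/site"
    MANIFEST = "/path/to/manifest.json"

    def snapshot(root):
        hashes = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    hashes[path] = hashlib.sha256(f.read()).hexdigest()
        return hashes

    current = snapshot(ROOT)
    if os.path.exists(MANIFEST):
        with open(MANIFEST) as f:
            previous = json.load(f)
        print("added:  ", sorted(set(current) - set(previous)))
        print("deleted:", sorted(set(previous) - set(current)))
        print("altered:", sorted(p for p in current
                                 if p in previous and current[p] != previous[p]))
    with open(MANIFEST, "w") as f:
        json.dump(current, f)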

Regards,

DK