How to block web searches based on keywords and filter HTTPS? [closed]

I'm working on a networking project for my internship. The objective is to block access to websites and Google search queries that contain specific keywords such as "alcohol", "porn", or "gambling", even when the traffic is carried over HTTPS.

I have a working Squid proxy that filters HTTP traffic using domain-based ACLs, but it cannot inspect or block HTTPS traffic because the payload is encrypted by TLS. To get around this, I enabled SSL Bump in Squid for man-in-the-middle-style HTTPS decryption, and put e2guardian behind it for deeper content and keyword filtering.

The problem is the Docker deployment of e2guardian. The container fails to start with the error "Config problem; check allowed values for proxyexchange", with both the default and the sample configurations. I have double-checked the config paths, file permissions, and CA certificate setup, and rebuilt the container with different base images and OpenSSL versions, but the error persists.

My goal is to inspect and block HTTPS traffic based on keywords in the URL or page content, ideally with an open-source toolchain such as Squid plus e2guardian. I also need guidance on installing and trusting the generated CA certificate on client devices so browsers don't show warnings. Any help with a working configuration, Dockerfile, or troubleshooting tips, especially around the proxyexchange setting, would be incredibly helpful; I've been stuck on this for several days and need a reliable solution to move the project forward.

Sketches of what I currently have follow (file names, paths, and ports are from my setup and may need adapting).
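Here is roughly the Squid side. The keyword-list path, port, and CA file name are mine and purely illustrative; the ssl_bump directives follow the standard peek-and-bump pattern for Squid 4+:

    # /etc/squid/squid.conf (relevant excerpt)

    # Keyword blocking: one regex per line in the file
    acl blocked_keywords url_regex -i "/etc/squid/blocked_keywords.txt"
    http_access deny blocked_keywords

    # HTTPS interception via SSL Bump
    http_port 3128 ssl-bump \
        cert=/etc/squid/ssl_cert/myCA.pem \
        generate-host-certificates=on \
        dynamic_cert_mem_cache_size=4MB

    # Helper that mints per-site certificates signed by myCA.pem
    sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

    acl step1 at_step SslBump1
    ssl_bump peek step1
    ssl_bump bump all

The certificate database has to be initialised once before Squid starts (the path must match the -s argument above; "proxy" is the user Squid runs as on my Debian setup):

    /usr/lib/squid/security_file_certgen -c -s /var/lib/squid/ssl_db -M 4MB
    chown -R proxy:proxy /var/lib/squid/ssl_db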
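On the e2guardian side, this is a stripped-down version of the e2guardian.conf I'm testing with. The option names come from the sample config shipped with e2guardian; the addresses and ports match my Squid setup above. Notably, proxyexchange is not an option I set explicitly anywhere:

    # /etc/e2guardian/e2guardian.conf (relevant excerpt)

    language = 'ukenglish'
    loglevel = 2

    # Port e2guardian listens on for client traffic
    filterip =
    filterports = 8080

    # Upstream Squid that e2guardian forwards to
    proxyip = 127.0.0.1
    proxyport = 3128

    # Phrase/keyword filtering strength
    weightedphrasemode = 2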
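The Dockerfile is essentially the following. The base image is one of several I've tried, and the apt package name is an assumption on my part (I've also tried building from source):

    FROM debian:bookworm-slim

    # Package name is an assumption; I have also built e2guardian from source
    RUN apt-get update \
        && apt-get install -y --no-install-recommends e2guardian ca-certificates \
        && rm -rf /var/lib/apt/lists/*

    # Drop in my config (keyword/phrase lists are mounted at runtime)
    COPY e2guardian.conf /etc/e2guardian/e2guardian.conf

    EXPOSE 8080

    # -N keeps e2guardian in the foreground so Docker can supervise it
    CMD ["e2guardian", "-N"]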
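For the CA itself I followed the usual Squid SSL Bump recipe; the file names are mine. On Debian/Ubuntu clients the system store is updated as shown below, but Firefox keeps its own certificate store, so the cert has to be imported there separately (certutil comes from libnss3-tools, and the profile directory is a placeholder):

    # Generate a self-signed CA (key + cert in one PEM, as Squid expects)
    openssl req -new -newkey rsa:2048 -sha256 -days 365 -nodes -x509 \
        -keyout myCA.pem -out myCA.pem

    # Export the certificate alone for distribution to clients
    openssl x509 -in myCA.pem -outform DER -out myCA.der
    openssl x509 -in myCA.pem -out myCA.crt

    # Trust it system-wide on a Debian/Ubuntu client
    sudo cp myCA.crt /usr/local/share/ca-certificates/myCA.crt
    sudo update-ca-certificates

    # Firefox: import into its own store with NSS certutil
    certutil -A -n "Squid CA" -t "C,," -i myCA.crt \
        -d sql:$HOME/.mozilla/firefox/<profile-dir>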