[Resolved] Squid HTTP and HTTPS proxy ACL whitelisting

Started by furest_, September 08, 2021, 10:10:45 PM

September 08, 2021, 10:10:45 PM Last Edit: September 10, 2021, 09:47:08 PM by furest_
Hi

I'm trying to set up Squid as a non-transparent proxy for both HTTP and HTTPS traffic in order to blacklist all web traffic except for a handful of URLs/domains.

While proxying itself is working fine, I'm having trouble configuring ACL whitelisting for HTTPS. It seems to work fine for HTTP though.

Test setup: OPNsense 21.7.2_1 in a KVM virtual machine. The client is a Rocky Linux 8 VM behind the OPNsense box. Tests are done with curl (e.g. curl http://google.com -x 192.168.1.1:3128).
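For reference, the tests look roughly like this; the HTTP command is the one above, and the HTTPS variant is what I assume is the equivalent test (it relies on the Squid CA already being trusted on the client):

# plain HTTP through the proxy
curl -x 192.168.1.1:3128 http://google.com
# HTTPS through the proxy (CONNECT + SSL inspection)
curl -x 192.168.1.1:3128 https://google.com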

Squid configuration:
- General forward settings:
   - Proxy interfaces: LAN
   - Transparent proxy disabled
   - SSL inspection enabled
   - CA to use: a CA generated locally, specifically for Squid. It is installed as a trusted certificate on the test client behind OPNsense.
- Access Control List:
   - Whitelist: this is the actual core of my issue. Some values do not behave as expected; see below.
   - Blacklist: ".[a-zA-Z]+" (basically block everything). See the squid.conf sketch after this list for how I assume these two fields end up in the generated config.

The rest of the config is untouched.
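For what it's worth, I assume the whitelist/blacklist fields end up as url_regex ACLs in the generated squid.conf, roughly like the sketch below. The ACL names, the ordering, and whether the plugin inlines the patterns or points them at a file are my guesses rather than something I checked in the generated config:

# illustrative sketch only - assumed translation of the GUI whitelist/blacklist fields
acl whitelist url_regex -i google.com
acl blacklist url_regex -i .[a-zA-Z]+
# assumed ordering: allow whitelisted URLs before denying blacklisted ones
http_access allow whitelist
http_access deny blacklist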

Basically, if I set the whitelist to "google.com" or "google", I can curl both http://google.com and https://google.com. This is the expected result.
If I set the whitelist to "http://google.com", I can only query http://google.com; https://google.com gives me the "Access Denied" page. This is also the expected result.

However, when the whitelist is set to "https?://google.com", I can query http://google.com successfully, but querying https://google.com gives me the "Access Denied" page. The expected result would be to successfully query https://google.com as well.

Also, when setting the whitelist to "https://google.com", I cannot query anything. The expected result would be to only be able to successfully query https://google.com.

It almost seems like, when checking the ACL for HTTPS traffic, Squid only matches the regex against the domain name of the site and not against the full requested URL.
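If that is the case, then for HTTPS the patterns would only ever be compared against something like "google.com:443" (the CONNECT target) rather than the full "https://google.com/" URL. A quick way to sanity-check the regexes outside of Squid is grep -E; the request strings below are my assumption of what Squid matches against, not something I confirmed in the access log:

echo 'http://google.com/' | grep -E 'https?://google.com'   # matches  -> HTTP request is whitelisted
echo 'google.com:443'     | grep -E 'https?://google.com'   # no match -> HTTPS CONNECT is not whitelisted
echo 'google.com:443'     | grep -E 'google.com'            # matches  -> why the plain "google.com" whitelist covers both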

Am I missing something there?


Apparently the issue was the blacklist.
Using "https?:\/\/" as the blacklist makes it work as intended: all traffic is denied except for what is whitelisted.

I cannot really explain why the original blacklist was not working, though. If someone has the correct explanation as to why ".[a-zA-Z]+" was blocking HTTPS traffic but not HTTP, I'd be more than happy to hear it.
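My only guess, building on the domain-only matching suspicion above, is that the old blacklist also matched the bare "google.com:443"-style CONNECT target (which the scheme-prefixed whitelist entries never could), while the new blacklist requires a scheme and therefore no longer catches it. The strings are again assumptions, quickly checked with grep -E rather than against Squid itself:

echo 'google.com:443'      | grep -E '.[a-zA-Z]+'     # matches  -> old blacklist catches the CONNECT unless a whitelist entry matched it first
echo 'google.com:443'      | grep -E 'https?:\/\/'    # no match -> new blacklist lets the CONNECT through
echo 'https://google.com/' | grep -E 'https?:\/\/'    # matches  -> bumped requests are still caught unless whitelisted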