HTTP Anti-Virus Proxy
http://havp.hege.li/forum/

Havp + Squid -- Problems with virus in url
http://havp.hege.li/forum/viewtopic.php?f=3&t=314

Author:  lmanrique [ 07 Dec 2007 14:39 ]
Post subject:  Havp + Squid -- Problems with virus in url

Hi, my name is Luis, I'm from Brazil, and I work at a university with some Windows workstations on the LAN. I found out about HAVP
reading an article in the Brazilian edition of Linux Magazine, but I'm a layman when it comes to proxies. I configured HAVP following the how-to, but there is a lack of documentation in Portuguese. So I ended up with the following setup:

- Debian Etch running iptables with the NAT rules and transparent proxy shown below:
# enable packet forwarding
echo 1 > /proc/sys/net/ipv4/ip_forward
# accept forwarded LAN traffic and masquerade it on the way out
iptables --append FORWARD --in-interface eth0 -j ACCEPT
iptables --table nat --append POSTROUTING --out-interface eth0 -j MASQUERADE
# transparently redirect web traffic to the local Squid
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
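(As a quick sanity check, not from the article: the packet counters on the REDIRECT rule should grow while a client browses.)

# list the NAT PREROUTING rules with packet/byte counters
iptables -t nat -L PREROUTING -n -v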

- Squid 2.6 configuration so HAVP can work as a parent:
############################################
############################################
http_port 3128 transparent
visible_hostname vm1

cache_mem 64 MB
maximum_object_size_in_memory 64 KB
maximum_object_size 512 MB
minimum_object_size 0 KB
cache_swap_low 90
cache_swap_high 95
cache_dir ufs /var/spool/squid 2048 16 256
cache_access_log /var/log/squid/access.log
refresh_pattern ^ftp: 15 20% 2280
refresh_pattern ^gopher: 15 0% 2280
refresh_pattern . 15 20% 2280

# HAVP: route requests through the scanner listening on 127.0.0.1:8080
cache_peer 127.0.0.1 parent 8080 0 no-query no-digest no-netdb-exchange default
cache_peer_access 127.0.0.1 allow all

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 901 # swat
acl Safe_ports port 1025-65535 # high ports
acl purge method PURGE
acl CONNECT method CONNECT

http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

acl redelocal src 10.180.0.0/24
http_access allow localhost
http_access allow redelocal

http_access deny all
##########################################
##########################################
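To confirm requests really flow through the HAVP parent, one simple check (not from the how-to; the path below is HAVP's usual default and may differ on your system) is to watch HAVP's access log while a client browses:

# each proxied request should appear here as it is scanned
tail -f /var/log/havp/access.log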

HAVP configuration file – running with libclamav:

#####
##### ClamAV Library Scanner (libclamav)
#####
ENABLECLAMLIB true
# HAVP uses libclamav hardcoded pattern directory, which usually is
# /usr/local/share/clamav. You only need to set CLAMDBDIR, if you are
# using non-default DatabaseDirectory setting in clamd.conf.
#
# Default: NONE
# CLAMDBDIR /path/to/directory
# Should we block broken executables?
#
# Default:
# CLAMBLOCKBROKEN false
# Should we block encrypted archives?
#
# Default:
# CLAMBLOCKENCRYPTED false
# Should we block files that go over maximum archive limits?
#
# Default:
# CLAMBLOCKMAX false
# Scanning limits _inside_ archives (filesize = MB):
# Read clamd.conf for more info.
#
# Default:
# CLAMMAXFILES 1000
# CLAMMAXFILESIZE 10
# CLAMMAXRECURSION 8
# CLAMMAXRATIO 250
This is the only part of the file that I changed.
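For this to line up with the cache_peer line in squid.conf, the listener settings elsewhere in havp.config stay at their defaults; as a sketch (directive names from the stock file):

# defaults, shown only for reference
PORT 8080
BIND_ADDRESS 127.0.0.1
TRANSPARENT false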

With this setup I configure my clients to use my server as a gateway. Testing against the EICAR website, the anti-virus works, but when the virus is at a URL such as
http://www.tpncs.com/NetEmpresa-3.3.25.exe the browser opens a window to save the file to disk. When I run clamd from the command line, it detects the virus in the file that was saved. My question is: is HAVP, in my configuration, scanning only cached files? How can I solve the problem described above?
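A download like the following from a LAN client should come back as HAVP's block page rather than the file itself (illustrative; this is the standard EICAR test URL):

wget -O - http://www.eicar.org/download/eicar.com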
Another question: what is the difference between my configuration and a sandwich configuration?

PS: I ask about the difference because I'm new to this subject; I have read about the sandwich configuration, but I don't understand the real difference between the two.

Thank you, I really appreciate your collaboration. And thanks to the HAVP development team.

Luis Manrique.

Author:  sebastian [ 09 Dec 2007 01:51 ]
Post subject: 

That file is pretty large, so it may slip through.

Try configuring:

KEEPBACKBUFFER
MAXSCANSIZE
MAXDOWNLOADSIZE

to the same value. This guarantees that every file is scanned completely before it is sent to the browser. But it may cause problems with very large downloads (> 50 MB), because the browser may time out.

Also use a low TRICKLE value and a low (but not too low) KEEPBACKTIME, so the browser doesn't time out.
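For example, with the directive names above (values purely illustrative; check the comments in your havp.config for the exact units):

Code:
# all three limits equal, so nothing is released before scanning finishes
MAXSCANSIZE 52428800
MAXDOWNLOADSIZE 52428800
KEEPBACKBUFFER 52428800
# keep these low so the browser does not time out
TRICKLE 10
KEEPBACKTIME 5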

The only reason for a sandwich configuration, like squid - havp - squid, is that the first Squid should not cache anything but handle authentication, if you want to password-protect internet access for your LAN users.

But if you only want scanning and caching, squid - havp - squid is EXACTLY the same as havp - squid.

PS. When clicking your link, it's detected as Trojan.Bancos-3784 DS.

Author:  hege [ 09 Dec 2007 11:19 ]
Post subject: 

sebastian wrote:
But if you only want scanning and caching, squid - havp - squid is EXACTLY the same as havp - squid.


Except that you lose flexibility without Squid in front: ACLs for whitelisting, routing, etc. Also, if users are forced to proxy HTTPS, it's more efficient through Squid than through HAVP.
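For example (a sketch, not from this thread), a front-Squid whitelist that keeps trusted domains away from the scanner entirely:

Code:
# hypothetical whitelist evaluated before requests reach the HAVP parent
acl scanbypass dstdomain .windowsupdate.com
cache_peer_access 127.0.0.1 deny scanbypass
always_direct allow scanbypass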

Author:  sebastian [ 09 Dec 2007 13:16 ]
Post subject: 

You can still use ACLs in the caching Squid that is placed after HAVP.
And for HTTPS, it's best to use Apache 2.2.6 configured as a proxy that eavesdrops on HTTPS traffic, sends it unencrypted through HAVP & Squid, and then re-encrypts the traffic at the other end.

This means the client will get an invalid-certificate warning, but the only thing needed is to import the root certificate into the browser.

(Maybe you can put the root certificate in an EXE that forcefully installs it and calls a URL after installation. Then you have a captive portal that makes sure the user cannot visit any site before this "secret" URL has been visited, which means the user has executed the EXE.)
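The certificate files referenced below can be produced with a plain OpenSSL self-signed CA, for example (illustrative commands; file names chosen to match the configuration):

Code:
# create a private CA (cacert.crt is what gets imported into browsers)
openssl req -new -x509 -days 3650 -keyout ca.key -out cacert.crt
# create the server key and certificate request (-nodes = unencrypted key)
openssl req -new -nodes -keyout server.key -out server.csr
# sign the server certificate with the private CA
openssl x509 -req -days 3650 -in server.csr -CA cacert.crt -CAkey ca.key -CAcreateserial -out server.crt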

Here is an Apache configuration you can use to eavesdrop on SSL traffic:
Code:
# HTTPS traffic (clients arrive here via the port-8443 redirect below)
<VirtualHost *:8443>
    RewriteEngine on
    # allow only GET and POST through the eavesdropping proxy
    RewriteCond %{REQUEST_METHOD} !^(GET|POST)
    RewriteRule .* - [F]
    ServerAdmin root@localhost
    ErrorLog /var/log/httpd/error_log
    TransferLog /var/log/httpd/access_log
    SSLEngine on
    SSLProtocol all -SSLv2
    SSLCipherSuite ALL:!ADH:!EXPORT56:!eNULL:!SSLv2:RC4+RSA:+HIGH:+MEDIUM:+LOW:+EXP

    # THIS is the certificate the user is going to get
    SSLCertificateFile /etc/httpd/server.crt
    # THIS is the private key
    SSLCertificateKeyFile /etc/httpd/server.key
    # THIS is the CA root; it also needs to be imported into your browser
    SSLCertificateChainFile /etc/httpd/cacert.crt
    SetEnv HOME /home/nobody

    # HAVP proxy
    ProxyRemote * http://127.0.0.1:8080
    ProxyPreserveHost Off
    # tell the Apache proxy on port 8445 that the traffic
    # was HTTPS before it was decrypted
    RequestHeader unset xsslcatch
    RequestHeader set xsslcatch ison
    Header unset Via
    Header unset X-Cache
    Header unset Vary
</VirtualHost>

# HTTP traffic (clients arrive here via the port-8444 redirect below)
<VirtualHost *:8444>
    RewriteEngine on
    # allow only GET and POST
    RewriteCond %{REQUEST_METHOD} !^(GET|POST)
    RewriteRule .* - [F]
    DocumentRoot /arpmessages
    ServerAdmin root@localhost
    ErrorLog /var/log/httpd/error_log
    TransferLog /var/log/httpd/access_log

    # HAVP proxy
    ProxyRemote * http://127.0.0.1:8080
    ProxyPreserveHost Off
    # tell the Apache proxy on port 8445 that the traffic was not
    # encrypted, and it should NOT re-encrypt it
    RequestHeader unset xsslcatch
    RequestHeader set xsslcatch isoff
    Header unset Via
    Header unset X-Cache
    Header unset Vary
</VirtualHost>


# your Squid proxy needs to use this as its last proxy: 127.0.0.1:8445
<VirtualHost *:8445>
    ProxyRequests On

    SSLProxyEngine on
    SSLProxyMachineCertificateFile clientcerts.pem
    ProxyVia block
    ProxyPreserveHost Off
    # ProxyMaxForwards -1  (not implemented in Apache yet; will come in 2.2.7)
    DocumentRoot /home/httpd/html
    ServerAdmin root@localhost
    ErrorLog /var/log/httpd/error_log
    TransferLog /var/log/httpd/access_log

    SetOutputFilter INFLATE
    Header unset Content-Encoding

    <Proxy *>
        RequestHeader unset Via
        RequestHeader unset X-Forwarded-For
        RequestHeader unset Accept-Encoding
        RequestHeader unset xsslcatch
        RequestHeader unset Cache-Control
        Header unset Pragma
        Header unset Cache-Control
        Header unset Expires
        Header set Cache-Control public
        Header set Expires "Thu, 31 Dec 2099 23:59:59 GMT"
        RewriteEngine On

        # re-encrypt requests that were HTTPS on the client side
        RewriteCond %{HTTP:xsslcatch} ^ison$
        RewriteRule ^proxy:http://(.*)$ proxy:https://$1
        # leave plain HTTP requests as they are
        RewriteCond %{HTTP:xsslcatch} ^isoff$
        RewriteRule ^proxy:http://(.*)$ proxy:http://$1
        # allow only GET and POST
        RewriteCond %{REQUEST_METHOD} !^(GET|POST)
        RewriteRule .* - [F]
    </Proxy>
</VirtualHost>
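The matching squid.conf lines for the Squid in the middle might look like this (a sketch; only the port comes from the comment above):

Code:
# send everything to the re-encrypting Apache instance, never direct
cache_peer 127.0.0.1 parent 8445 0 no-query no-digest default
never_direct allow all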


And then use the following iptables rules:

Code:
# redirect web traffic arriving on every interface except the
# internet-facing one to the eavesdropping Apache vhosts
/sbin/iptables -t nat -A PREROUTING -p tcp ! -i eth3 --dport 443 -j REDIRECT --to-port 8443
/sbin/iptables -t nat -A PREROUTING -p tcp ! -i eth3 --dport 80 -j REDIRECT --to-port 8444

and "eth3" is your internet interface.
You need some accept rules too.

Then configure transparent mode off on both Squid and HAVP. (Apache takes care of this bit.)
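Concretely, something like this (a sketch using directives that appear earlier in the thread):

Code:
# havp.config
TRANSPARENT false

# squid.conf (Squid 2.6): plain http_port, without the "transparent" keyword
http_port 3128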

Then you have a proxy chain that looks like this:
Apache - HAVP - Squid - Apache
And it will scan both HTTP and HTTPS traffic transparently.

Author:  hege [ 09 Dec 2007 14:45 ]
Post subject: 

sebastian wrote:
You can still use ACLs in the caching Squid that is placed after HAVP.


You are missing the point; there are many things you might want to do before a request reaches HAVP (or so that it never reaches HAVP at all). ;)

Quote:
And for HTTPS, it's best to use Apache 2.2.6 configured as a proxy that eavesdrops on HTTPS traffic, sends it unencrypted through HAVP & Squid, and then re-encrypts the traffic at the other end.


This is way off-topic too. But props for a good example.
