Hello, I am installing Squid as an HTTP proxy for scraping URLs from Google with a tool. I have configured everything, but when I connect to Squid it returns a TCP 403 error. I am on Ubuntu 14.04 and have not changed the default settings, apart from changing http_access deny all to allow; everything else is the same. I want to use this proxy for scraping. What should the config file look like? I have been searching for more than two days now. Here is my squid.conf:
http_port 3128
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
acl localnet src 10.0.0.0/8 # RFC 1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC 1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC 1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443 # https
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
http_access allow localnet
http_access allow localhost
http_access deny all
Error 403 means "Forbidden": Squid is refusing to serve the request. Change the last line of the config in your question from
http_access deny all
to
http_access allow all
This changes your policy from default-deny (a whitelist: only sources you explicitly allow get through) to default-allow (a blacklist: everything gets through unless explicitly denied). You should add your own ACLs above that line, as in the sketch below.
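Note that allow all makes this an open proxy, so a safer variant is to allow only the machine your scraping tool runs on. A minimal sketch, assuming a hypothetical client address of 203.0.113.10 (replace it with your own), placed above the final deny all:

acl scraper src 203.0.113.10 # hypothetical IP of the scraping machine
http_access allow scraper
http_access deny all

After editing, reload Squid (the package is squid3 on Ubuntu 14.04) and test with something like curl, replacing proxy-host with your server's address:

sudo service squid3 restart
curl -x http://proxy-host:3128 http://www.example.com/

If curl returns the page instead of a 403, the ACL is working and your scraping tool should be able to connect the same way.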