HttpWebRequest Proxies in C# in 2024

Jan 9, 2024 · 4 min read

The easiest way to direct HttpWebRequest traffic through a proxy is via the WebProxy class.

It lets you specify proxy server details that are applied to every outbound request once configured.

Creating a WebProxy

Let's create a basic instance:

var proxy = new WebProxy("104.236.198.185:8080");

Here we pass the proxy's IP + port combo into the constructor. This creates a definition HttpWebRequest can use.
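If the address string has no scheme, WebProxy prepends http:// for you, and there is also an equivalent host/port constructor overload. A quick sketch (the IP here is just a placeholder):

```csharp
using System;
using System.Net;

// "host:port" string — WebProxy prepends the http:// scheme itself
var fromString = new WebProxy("104.236.198.185:8080");

// equivalent host + port overload
var fromParts = new WebProxy("104.236.198.185", 8080);

Console.WriteLine(fromString.Address); // http://104.236.198.185:8080/
Console.WriteLine(fromParts.Address);  // http://104.236.198.185:8080/
```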

Assigning to HttpWebRequest

We can make our request flow through the proxy by attaching it to the request's Proxy property:

request.Proxy = proxy;

Now when we call GetResponse(), it will route the request via the proxy instead of connecting directly.

The full flow looks like:

var proxy = new WebProxy("104.236.198.185:8080");

var request = WebRequest.Create("https://www.example.org/") as HttpWebRequest;

request.Proxy = proxy;

using var response = request.GetResponse() as HttpWebResponse;

Note that the `using` declaration disposes the response for us, so there's no need to call Close() explicitly.

And that's the basic gist of applying a proxy!

The target site will see our request coming from the proxy's IP instead. We're successfully obscured and ready to start scraping anonymously.
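As an aside, WebProxy can also be told to skip the proxy for certain destinations, via BypassProxyOnLocal and BypassList (an array of regex patterns). A small sketch with placeholder values:

```csharp
using System;
using System.Net;

var proxy = new WebProxy("104.236.198.185:8080")
{
    // don't proxy requests to local (no-dot) hostnames
    BypassProxyOnLocal = true,
    // regex patterns for hosts that should also go direct
    BypassList = new[] { @"internal\.example\.org" }
};

Console.WriteLine(proxy.IsBypassed(new Uri("http://localhost/")));             // True
Console.WriteLine(proxy.IsBypassed(new Uri("https://internal.example.org/"))); // True
Console.WriteLine(proxy.IsBypassed(new Uri("https://www.example.org/")));      // False
```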

A few gotchas:

  • Always set the proxy before calling GetResponse(). Once the request has been submitted, assigning Proxy throws an InvalidOperationException.
  • Prefer a proxy provider service over random free public proxies. Free proxies rarely work reliably at scale for sustained scraping, and code samples built around them can give a false sense of effectiveness.

Alright, time to level up!

Proxy Authentication

Many enterprise-grade proxies protect access with a username and password for added security.

Here's how to plug them in.

The WebProxy class exposes a Credentials property that expects a NetworkCredential object:

var credentials = new NetworkCredential("my_username", "p@ssw0rd!");

var proxy = new WebProxy("104.236.198.185:8080");

proxy.Credentials = credentials;

We pass the username and password when creating NetworkCredential and assign it to the proxy instance. Any request that flows through this proxy then authenticates automatically.
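The same setup can be written in one step with an object initializer (endpoint and credentials here are placeholders; substitute your provider's values):

```csharp
using System.Net;

// proxy endpoint plus credentials in a single expression
var proxy = new WebProxy("104.236.198.185", 8080)
{
    Credentials = new NetworkCredential("my_username", "p@ssw0rd!")
};
```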

Now when the proxy server receives traffic from our scraper, it can validate access permissions and approve the connection.

Pretty straightforward! With this knowledge you can work with just about any secured proxy.

Common errors

Two scenarios that often trip people up:

1. Incorrect credentials

You'll see errors like 407 Proxy Authentication Required if the username or password is wrong. If you run into authorization problems, double-check that the credentials exactly match what your provider issued.

2. Credentials without proxy

Don't specify credentials without configuring the proxy itself! A common mistake is setting NetworkCredential on a raw request or handler. This fails because there needs to be a WebProxy instance to attach them to.
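Programmatically, a wrong password surfaces as a WebException whose response carries status 407 (HttpStatusCode.ProxyAuthenticationRequired). One way to surface that clearly — a sketch, where FetchViaProxy is a hypothetical helper name:

```csharp
using System;
using System.IO;
using System.Net;

static string FetchViaProxy(HttpWebRequest request)
{
    try
    {
        using var response = (HttpWebResponse)request.GetResponse();
        using var reader = new StreamReader(response.GetResponseStream());
        return reader.ReadToEnd();
    }
    catch (WebException ex) when (ex.Response is HttpWebResponse resp
        && resp.StatusCode == HttpStatusCode.ProxyAuthenticationRequired)
    {
        // 407 came from the proxy itself, not the target site
        throw new InvalidOperationException(
            "Proxy rejected credentials — re-check username/password.", ex);
    }
}
```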

With authentication basics out of the way, let's discuss...

Default System Proxy Settings

Sometimes you want to reuse existing proxy options defined externally for simplicity.

Rather than manually creating a WebProxy, .NET lets you leverage system defaults:

// Fetch default proxy info from app/system config
var defaultProxy = WebRequest.DefaultWebProxy;

request.Proxy = defaultProxy;

Here we simply grab whatever proxy is already configured globally and attach it to the request object.

There are a couple of ways system defaults get established:

1. App config files

In app.config or web.config (for .NET Framework apps), you can specify proxy details like:

<!-- Set global default proxy -->

<system.net>
  <defaultProxy>
    <proxy usesystemdefault="true"
           proxyaddress="http://192.168.1.10:3128"
           bypassonlocal="true" />
  </defaultProxy>
</system.net>

2. Machine-wide settings

Proxy settings configured machine-wide, via Internet Explorer or the Windows proxy settings, can also define the defaults.

So this technique makes it easy to inherit already-configured proxies without redefining them manually in code.
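You can also ask the system proxy which route a given URL would take — the IWebProxy interface exposes GetProxy and IsBypassed for this. A quick sketch:

```csharp
using System;
using System.Net;

var systemProxy = WebRequest.GetSystemWebProxy();
var target = new Uri("https://www.example.org/");

if (systemProxy.IsBypassed(target))
{
    Console.WriteLine("Request would go direct.");
}
else
{
    // GetProxy returns the proxy URI, or the target itself when no proxy applies
    Console.WriteLine($"Request would route via {systemProxy.GetProxy(target)}");
}
```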

Now that we've covered basic setup, let's look at...

Making Requests Via Proxy

So far we've focused on configuring proxies, but not much on practical usage.

While the proxy assignment itself is straightforward, scraping through a proxy involves a few additional considerations.

Let's walk through a more complete example:

// proxy instance with auth
var proxy = new WebProxy("104.236.198.185:8080")
{
    Credentials = new NetworkCredential("my_user", "p@ss")
};

// building request
var request = WebRequest.Create("https://www.example.org") as HttpWebRequest;
request.Proxy = proxy;

// additional headers
request.UserAgent = "MyApp v1.0";
request.Referer = "https://www.referringsite.com";

// fetching response (disposed automatically by the using declaration)
using var response = request.GetResponse() as HttpWebResponse;

// processing response
using var reader = new StreamReader(response.GetResponseStream());

var contents = reader.ReadToEnd();

// handle output
SaveToDatabase(contents);
    
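Worth noting in 2024: HttpWebRequest is considered legacy in modern .NET, and the same proxy setup maps onto HttpClient via HttpClientHandler. A sketch with the same placeholder endpoint:

```csharp
using System.Net;
using System.Net.Http;

// same proxy + credentials, attached to an HttpClientHandler instead
var handler = new HttpClientHandler
{
    Proxy = new WebProxy("104.236.198.185", 8080)
    {
        Credentials = new NetworkCredential("my_user", "p@ss")
    },
    UseProxy = true
};

using var client = new HttpClient(handler);
// var html = await client.GetStringAsync("https://www.example.org");
```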
