
Google recently announced Puppeteer, a new tool to assist with Chrome browser automation. Code examples are included so you can follow along. Automating browsers provides many benefits, including faster execution of repetitive tasks, the ability to parallelise workloads, and improved test coverage for your website.

I had a more difficult variation of this, using Puppeteer Sharp. I needed both Headers and Cookies set before the download would start. In essence, before the button click, I had to process multiple responses and handle the single response carrying the download. Once I had that particular response, I had to attach headers and cookies so the remote server would send the downloadable data in the response.

```csharp
await using (var browser = await Puppeteer.LaunchAsync(new LaunchOptions { /* launch options elided in the original */ }))
await using (var page = await browser.NewPageAsync())
{
    // Handle multiple responses and process the Download
    page.Response += async (sender, responseCreatedEventArgs) =>
    {
        var response = responseCreatedEventArgs.Response;

        // Reconstructed: the original does not show how contentType was read
        if (!response.Headers.TryGetValue("content-type", out var contentType))
            return;

        // Handle the response with the Excel download
        if (contentType.Contains("application/vnd.ms-excel"))
        {
            var getUrl = response.Url;

            // Add the cookies to a container for the upcoming Download GET request
            var pageCookies = await page.GetCookiesAsync();
            var cookieContainer = BuildCookieContainer(pageCookies);

            // fullPath and cancellationToken come from the author's surrounding code
            await DownloadFileRequiringHeadersAndCookies(getUrl, fullPath, cookieContainer, cancellationToken);
        }
    };

    await page.ClickAsync("button");

    // NEED THIS TIMEOUT TO KEEP THE BROWSER OPEN WHILE THE FILE IS DOWNLOADING!
    await page.WaitForTimeoutAsync(1000 * configs.DownloadDurationEstimateInSeconds);
}
```

Populate the Cookie Container like this:

```csharp
private CookieContainer BuildCookieContainer(IEnumerable<CookieParam> cookies)
{
    var cookieContainer = new CookieContainer();
    foreach (var cookie in cookies)
    {
        cookieContainer.Add(new Cookie(cookie.Name, cookie.Value, cookie.Path, cookie.Domain));
    }
    return cookieContainer;
}
```

The details of DownloadFileRequiringHeadersAndCookies are here; a rough sketch of what such a method might look like follows below. If your needs to download a file are simpler, you can probably use the other methods mentioned on this thread, or the linked thread; see the second sketch below.
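Since the linked details of DownloadFileRequiringHeadersAndCookies are not reproduced in this excerpt, here is a minimal sketch of what a method with that signature might do, streaming the response to disk with HttpClient and reusing the cookie container built above. This is an assumption on my part, not the author's code; the User-Agent value is a placeholder for whatever headers the remote server actually requires.

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch only: the author's real implementation is behind the link above.
// Any extra headers the server demands (User-Agent, Referer, Accept, ...) would be
// added to client.DefaultRequestHeaders before issuing the GET.
private static async Task DownloadFileRequiringHeadersAndCookies(
    string url,
    string destinationPath,
    CookieContainer cookieContainer,
    CancellationToken cancellationToken)
{
    using var handler = new HttpClientHandler { CookieContainer = cookieContainer };
    using var client = new HttpClient(handler);
    client.DefaultRequestHeaders.UserAgent.ParseAdd("Mozilla/5.0"); // placeholder header

    using var response = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead, cancellationToken);
    response.EnsureSuccessStatusCode();

    // Stream the body straight to disk rather than buffering it in memory
    await using var source = await response.Content.ReadAsStreamAsync(cancellationToken);
    await using var destination = File.Create(destinationPath);
    await source.CopyToAsync(destination, cancellationToken);
}
```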

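For the simpler case, where no custom headers or cookies are needed, one common alternative is to let Chromium write the file itself by enabling downloads through the DevTools Protocol before clicking. A minimal sketch, assuming Puppeteer Sharp exposes the CDP session as page.Client; the URL, selector, and download path are placeholders:

```csharp
using PuppeteerSharp;

// Minimal sketch of the simpler case: Chromium saves the file directly,
// so no cookie container or header handling is required.
await using var browser = await Puppeteer.LaunchAsync(new LaunchOptions());
await using var page = await browser.NewPageAsync();

// "Page.setDownloadBehavior" is a Chrome DevTools Protocol command
await page.Client.SendAsync("Page.setDownloadBehavior", new
{
    behavior = "allow",
    downloadPath = @"C:\temp\downloads" // placeholder path
});

await page.GoToAsync("https://example.com/report"); // placeholder URL
await page.ClickAsync("#download-button");          // placeholder selector

// Give the download time to finish before the browser is disposed
await page.WaitForTimeoutAsync(5000);
```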