Is it possible to reduce RAM consumption when using Selenium, GeckoDriver and Firefox?


I use Selenium and the Firefox webdriver with Python to scrape data from a website.

In the script, I need to access this website more than 10k times, and doing so consumes a lot of RAM.

Usually, by the time the script has accessed the site about 2,500 times, it is already consuming 4 GB or more of RAM and it stops working.

Is it possible to reduce RAM consumption without closing the browser session?

I ask because when I start the script I need to log in to the site manually (two-factor authentication; that code is not shown below), and if I close the browser session I will have to log in again.

# setup (driver creation and the manual login happen before this loop;
# output path and counter shown here for completeness)
file2 = open("output.txt", "a")
indice = 0

for itemLista in lista:
    driver.get("https://mytest.site.com/query/option?opt=" + str(itemLista))

    isActivated = driver.find_element_by_xpath('//div/table//tr[2]//td[1]')
    activationDate = driver.find_element_by_xpath('//div/table//tr[2]//td[2]')

    print(str(isActivated.text))
    print(str(activationDate.text))

    indice += 1
    print("numero: " + str(indice))

    file2.write(itemLista + " " + str(isActivated.text) + " " + str(activationDate.text) + "\n")

# close file
file2.close()

I discovered how to avoid the memory leak.

I just added

time.sleep(2)

after

file2.write(itemLista+" "+str(isActivated.text)+" "+str(activationDate.text)+"\n")

Now Firefox runs without consuming lots of RAM.

It is just perfect.

I don't know exactly why it stopped consuming so much memory, but I think memory consumption kept growing because the loop didn't give each driver.get request time to finish.
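The pacing idea above can be sketched as a small helper that inserts the delay between iterations. This is only an illustration of the fix; the `paced` name, the injectable `sleep` parameter, and the two-second default are my own choices, not part of the original script:

```python
import time

def paced(items, delay=2.0, sleep=time.sleep):
    """Yield items one by one, pausing after each so the previous
    driver.get() request has time to settle before the next starts."""
    for item in items:
        yield item
        sleep(delay)

# usage sketch: the scraping loop becomes
#   for itemLista in paced(lista, delay=2.0):
#       driver.get("https://mytest.site.com/query/option?opt=" + str(itemLista))
#       ...
#       file2.write(...)
```

Making `sleep` a parameter keeps the helper testable without actually waiting.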


As mentioned in my comment, only open and write to your file on each iteration instead of keeping it open in memory:

# remove the line file2 = open(...) from your code

for itemLista in lista:
    driver.get("https://mytest.site.com/query/option?opt="+str(itemLista))

    isActivated = driver.find_element_by_xpath('//div/table//tr[2]//td[1]')
    activationDate = driver.find_element_by_xpath('//div/table//tr[2]//td[2]')

    print(str(isActivated.text))
    print(str(activationDate.text))

    indice+=1
    print("numero: "+str(indice))

    with open("your file path here", "w") as file2:
        file2.write(itemLista+" "+str(isActivated.text)+" "+str(activationDate.text)+"\n")

While Selenium is quite a memory-hungry beast, it doesn't necessarily murder your RAM with each growing iteration. However, the growing open buffer of file2 does take up RAM the more you write to it. Only when it is closed does it release the virtual memory and write to disk.


Comments
  • Maybe instead of keeping file2 open, only open and write to it once per iteration? It seems like the culprit is the growing size of file2 in your buffer.
  • Did you consider Headless Firefox or PhantomJS or HTMLUnit browsers as an option?
  • @DebanjanB I think a headless browser is not an option for me, because when I access the site I need to enter a password. The site is protected by two-factor authentication: I receive a code by email each time I try to log in.
  • I'm curious, can you get performance graphs from your OS? One that tracks the browser's process and your script; it'll help you nail down which one is causing the memory usage hike. (I'm mostly curious because I'd love to see the browser's graph :D, its behavior during 10k navigations is very interesting.)
  • Use Chrome; it uses less memory.
  • Unfortunately I can't use driver.quit() because it will destroy the web session. I need the web session because when I run the script I have to manually input a two-factor password. If I set the script to call driver.quit() after 2000 iterations, I will need to enter the password again when I restart the driver. But just like you said, I think there is no other solution to this problem.
  • In that case you can invoke driver.close() and forcefully kill the Firefox browser instances.
  • I made the change that you suggested, but unfortunately it didn't fix my problem. The Firefox browser still consumes a lot of RAM. I think even with file2 still open, it doesn't affect the RAM usage much.
  • Odd. Is your driver creating a new instance of the browser each time? On Chrome I notice that it creates a new chrome.exe process, but it is quickly killed off and the RAM usage is in check. Not sure how the Firefox driver behaves. If anything I'd guess the RAM rises at each driver.get()... if that is so, you could consider creating a new driver and closing it each iteration, but it's probably more time consuming.
  • This solution will not decrease the memory usage - Python does not hold the already-written file content in memory; it only keeps a buffer of what is pending writing - the one thing in RAM relevant to your concern, and this buffer's default size on most OSes is just a single line. In fact this approach leads to a bad/unexpected result - on every iteration the file is re-created with just the last line; i.e. the script will not store all values, just the last one.
  • @Todor Minakov You are right, only the last line is saved.
  • @TodorMinakov Good point, I shouldn't have used w mode. I thought the buffer would take up virtual memory but maybe I'm mistaken. Thanks for pointing it out.
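  • Picking up the thread's conclusion: the per-iteration write does work if the file is opened in append mode ("a"), which adds to the file, instead of "w", which truncates it each time. A minimal sketch (the helper name and the field names are mine, not from the original code):

```python
def append_record(path, item_id, is_activated, activation_date):
    # "a" appends instead of truncating, so earlier lines survive;
    # the with-block closes the handle (flushing any buffer) on every call
    with open(path, "a") as f:
        f.write(item_id + " " + is_activated + " " + activation_date + "\n")
```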