JupyterLab running on a cluster uses up all available RAM when multiple notebooks are opened #1455

joshdorrington opened this issue on Aug 28, 2024
Hi, I'm cross-posting an issue I raised on the main JupyterLab repository, as I hope you might have some ideas about how to debug it.
Original thread: jupyterlab/jupyterlab#16719

Description

I am running JupyterLab (v4.2.3) on a Linux cluster, initialised with:

nohup jupyter lab --port=1027 --no-browser &
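
In case it helps with debugging, here is a rough sketch of how I could relaunch the server with verbose logging captured to a file (the log path is arbitrary and just an example; --debug is the standard Jupyter Server flag for debug-level logging):

# relaunch with debug-level logging, redirecting stdout/stderr to a log file
nohup jupyter lab --port=1027 --no-browser --debug > ~/jupyterlab-debug.log 2>&1 &

# follow the log while opening notebooks to see what the server is doing
tail -f ~/jupyterlab-debug.log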

I have noticed a persistent issue where, when I open multiple notebooks (fewer than 10), memory usage of the JupyterLab server climbs to over 100 GB and the server becomes unresponsive, so I have to kill it manually from the command line. I don't even have to run any cells for this to happen, so it is not linked to the actual contents of my notebooks.

Reproduce

I can reproduce this by opening several blank notebooks in different folders and perhaps running some basic code in them (e.g. a = 1 + 1). Within a few minutes the high memory usage occurs. If I limit myself to a single notebook it does not happen.
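
To narrow down whether it is the jupyter-lab server process itself or the notebook kernels that hold the memory, something like the following can be run on the cluster node while the notebooks are open. This is only a sketch; the ps/pgrep options are standard procps ones, but the output columns may differ slightly between distributions:

# my processes sorted by resident memory (RSS, in kilobytes), refreshed every 5 seconds
watch -n 5 'ps -u $USER -o pid,rss,etime,args --sort=-rss | head -n 15'

# one-off snapshot of just the JupyterLab server and kernel processes
ps -o pid,rss,etime,args -p $(pgrep -u $USER -f 'jupyter-lab|ipykernel')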

Expected behavior

Memory usage should be linked to the code I execute within notebooks.

Context

Operating system (laptop on which the browser window is open): Ubuntu 22.04.4 LTS
Operating system (remote server on which JupyterLab is running): Red Hat Enterprise Linux Server 7.9 (Maipo)

Browser and version: Firefox v129.02
JupyterLab version: 4.2.3
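
For reference, the server-side versions above can be re-checked with the usual commands, run on the remote cluster in the environment that launches JupyterLab (a sketch; the package-name filter is just illustrative):

# report the versions of the core Jupyter components
jupyter --version

# list the Jupyter-related Python packages in the active environment
pip list 2>/dev/null | grep -i -E 'jupyter|notebook|ipykernel'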