
Unable to connect to Medusa - Log shows everything is running as usual #11645

astronyu opened this issue Mar 13, 2024 · 9 comments

astronyu commented Mar 13, 2024

My copy of pyMedusa is suddenly showing "unable to connect to Medusa". Not sure what happened, as it had been running fine prior to this. This pyMedusa instance is a Docker install.

The Docker container runs on a different machine than the media. The media volume is mounted as a writable CIFS share.

From the log, my guess (I'm no dev) is that this is caused by a "storage check" function in the script, which cannot get the available storage because the media volume is a network drive.

2024-03-13T10:21:28.106356156Z     yielded = self.gen.throw(*exc_info)  # type: ignore
2024-03-13T10:21:28.106358680Z   File "/app/medusa/medusa/server/api/v2/base.py", line 64, in async_call
2024-03-13T10:21:28.106361275Z     content = yield IOLoop.current().run_in_executor(executor, prepared)
2024-03-13T10:21:28.106363630Z   File "/app/medusa/ext/tornado/gen.py", line 762, in run
2024-03-13T10:21:28.106366074Z     value = future.result()
2024-03-13T10:21:28.106368299Z   File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
2024-03-13T10:21:28.106370723Z     result = self.fn(*self.args, **self.kwargs)
2024-03-13T10:21:28.106372997Z   File "/app/medusa/medusa/server/api/v2/config.py", line 571, in get
2024-03-13T10:21:28.106375302Z     config_data[section] = DataGenerator.get_data(section)
2024-03-13T10:21:28.106377536Z   File "/app/medusa/medusa/server/api/v2/config.py", line 647, in get_data
2024-03-13T10:21:28.106379880Z     return getattr(cls, 'data_' + section)()
2024-03-13T10:21:28.106382115Z   File "/app/medusa/medusa/server/api/v2/config.py", line 1131, in data_system
2024-03-13T10:21:28.106384439Z     section_data['diskSpace'] = generate_location_disk_space()
2024-03-13T10:21:28.106386673Z   File "/app/medusa/medusa/queues/utils.py", line 119, in generate_location_disk_space
2024-03-13T10:21:28.106388998Z     'freeSpace': get_disk_space_usage(app.TV_DOWNLOAD_DIR)
2024-03-13T10:21:28.106391222Z   File "/app/medusa/medusa/helpers/__init__.py", line 1376, in get_disk_space_usage
2024-03-13T10:21:28.106393546Z     st = os.statvfs(disk_path)
2024-03-13T10:21:28.106395740Z BlockingIOError: [Errno 11] Resource temporarily unavailable: '/downloads/tv'
2024-03-13T10:48:20.990902793Z 2024-03-13 18:48:20 ERROR    TORNADO :: [c5a1e37] Uncaught exception in APIv2: BlockingIOError(11, 'Resource temporarily unavailable')
2024-03-13T10:48:20.990950643Z Request: GET /api/v2/config/ (172.27.0.1)
2024-03-13T10:48:20.990954931Z Traceback (most recent call last):
2024-03-13T10:48:20.990958478Z   File "/app/medusa/ext/tornado/web.py", line 1704, in _execute
2024-03-13T10:48:20.990961934Z     result = await result
2024-03-13T10:48:20.990965010Z   File "/app/medusa/ext/tornado/gen.py", line 769, in run
2024-03-13T10:48:20.990968266Z     yielded = self.gen.throw(*exc_info)  # type: ignore
2024-03-13T10:48:20.990971261Z   File "/app/medusa/medusa/server/api/v2/base.py", line 64, in async_call
2024-03-13T10:48:20.990974457Z     content = yield IOLoop.current().run_in_executor(executor, prepared)
2024-03-13T10:48:20.990977453Z   File "/app/medusa/ext/tornado/gen.py", line 762, in run
2024-03-13T10:48:20.990980429Z     value = future.result()
2024-03-13T10:48:20.990983304Z   File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
2024-03-13T10:48:20.990986320Z     result = self.fn(*self.args, **self.kwargs)
2024-03-13T10:48:20.990989205Z   File "/app/medusa/medusa/server/api/v2/config.py", line 571, in get
2024-03-13T10:48:20.990992201Z     config_data[section] = DataGenerator.get_data(section)
2024-03-13T10:48:20.990995086Z   File "/app/medusa/medusa/server/api/v2/config.py", line 647, in get_data
2024-03-13T10:48:20.990998061Z     return getattr(cls, 'data_' + section)()
2024-03-13T10:48:20.991000907Z   File "/app/medusa/medusa/server/api/v2/config.py", line 1131, in data_system
2024-03-13T10:48:20.991003902Z     section_data['diskSpace'] = generate_location_disk_space()
2024-03-13T10:48:20.991006818Z   File "/app/medusa/medusa/queues/utils.py", line 119, in generate_location_disk_space
2024-03-13T10:48:20.991009803Z     'freeSpace': get_disk_space_usage(app.TV_DOWNLOAD_DIR)
2024-03-13T10:48:20.991012659Z   File "/app/medusa/medusa/helpers/__init__.py", line 1376, in get_disk_space_usage
2024-03-13T10:48:20.991015634Z     st = os.statvfs(disk_path)
2024-03-13T10:48:20.991018480Z BlockingIOError: [Errno 11] Resource temporarily unavailable: '/downloads/tv'
2024-03-13T10:48:31.904624584Z 2024-03-13 18:48:31 ERROR    TORNADO :: [c5a1e37] Uncaught exception in APIv2: BlockingIOError(11, 'Resource temporarily unavailable')
2024-03-13T10:48:31.904661955Z Request: GET /api/v2/config/ (172.27.0.1)
2024-03-13T10:48:31.904666123Z Traceback (most recent call last):
2024-03-13T10:48:31.904669440Z   File "/app/medusa/ext/tornado/web.py", line 1704, in _execute
2024-03-13T10:48:31.904673066Z     result = await result
2024-03-13T10:48:31.904676002Z   File "/app/medusa/ext/tornado/gen.py", line 769, in run
2024-03-13T10:48:31.904679018Z     yielded = self.gen.throw(*exc_info)  # type: ignore
2024-03-13T10:48:31.904681893Z   File "/app/medusa/medusa/server/api/v2/base.py", line 64, in async_call
2024-03-13T10:48:31.904685220Z     content = yield IOLoop.current().run_in_executor(executor, prepared)
2024-03-13T10:48:31.904702172Z   File "/app/medusa/ext/tornado/gen.py", line 762, in run
2024-03-13T10:48:31.904705348Z     value = future.result()
2024-03-13T10:48:31.904708164Z   File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
2024-03-13T10:48:31.904711159Z     result = self.fn(*self.args, **self.kwargs)
2024-03-13T10:48:31.904713965Z   File "/app/medusa/medusa/server/api/v2/config.py", line 571, in get
2024-03-13T10:48:31.904716860Z     config_data[section] = DataGenerator.get_data(section)
2024-03-13T10:48:31.904719655Z   File "/app/medusa/medusa/server/api/v2/config.py", line 647, in get_data
2024-03-13T10:48:31.904722521Z     return getattr(cls, 'data_' + section)()
2024-03-13T10:48:31.904725276Z   File "/app/medusa/medusa/server/api/v2/config.py", line 1131, in data_system
2024-03-13T10:48:31.904728172Z     section_data['diskSpace'] = generate_location_disk_space()
2024-03-13T10:48:31.904734454Z   File "/app/medusa/medusa/queues/utils.py", line 119, in generate_location_disk_space
2024-03-13T10:48:31.904737500Z     'freeSpace': get_disk_space_usage(app.TV_DOWNLOAD_DIR)
2024-03-13T10:48:31.904740305Z   File "/app/medusa/medusa/helpers/__init__.py", line 1376, in get_disk_space_usage
2024-03-13T10:48:31.904743180Z     st = os.statvfs(disk_path)
2024-03-13T10:48:31.904745916Z BlockingIOError: [Errno 11] Resource temporarily unavailable: '/downloads/tv'

A few lines after that, I saw these:

2024-03-13T10:31:08.644872600Z 2024-03-13 18:31:08 DEBUG    POSTPROCESSOR :: [c5a1e37] Starting new thread: POSTPROCESSOR
2024-03-13T10:31:08.878554232Z 2024-03-13 18:31:08 INFO     POSTPROCESSOR :: [c5a1e37] Processing path: /downloads/tv
2024-03-13T10:31:08.929215327Z 2024-03-13 18:31:08 DEBUG    POSTPROCESSOR :: [c5a1e37] No processable items found in folder: /downloads/tv
2024-03-13T10:31:08.944488304Z 2024-03-13 18:31:08 DEBUG    POSTPROCESSOR :: [c5a1e37] No resource_name passed, using path [/downloads/tv/Resident Alien] to process as a source folder
2024-03-13T10:31:08.953664813Z 2024-03-13 18:31:08 DEBUG    POSTPROCESSOR :: [c5a1e37] Processing folder: /downloads/tv/Resident Alien
2024-03-13T10:31:08.955476953Z 2024-03-13 18:31:08 DEBUG    POSTPROCESSOR :: [c5a1e37] Packed files detected: ['97c219b1760148e488c5443b046e15a6.part01.rar']

So it can see the folder and its contents.
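
For what it's worth, every traceback above dies in the same place: `os.statvfs()` on the CIFS-backed `/downloads/tv` raising `BlockingIOError` (errno 11, `EAGAIN`). A minimal sketch of a defensive wrapper, assuming nothing about Medusa's internals (`safe_disk_space` is a hypothetical helper, not the project's `get_disk_space_usage`), that would let the config endpoint report "unknown" instead of crashing:

```python
import errno
import os


def safe_disk_space(path):
    """Return free bytes for `path`, or None if the filesystem call fails.

    os.statvfs() can raise BlockingIOError (errno EAGAIN, "Resource
    temporarily unavailable") when `path` sits on a CIFS/SMB mount whose
    server connection has gone stale -- which is exactly where every
    traceback above dies.
    """
    try:
        st = os.statvfs(path)
    except OSError as err:  # BlockingIOError is a subclass of OSError
        if err.errno in (errno.EAGAIN, errno.EIO):
            return None
        raise
    return st.f_bavail * st.f_frsize
```

Returning None (rendered as "unknown" disk space) would keep the UI loading even while the mount is wedged.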

Can someone help?

@KimuzukashiiKuma

I seem to have the same problem.
Only a few days ago the server worked normally. Then I couldn't connect to Medusa.
It seems that in the background Medusa works as normal; shows are being downloaded, but I can't see the show list.
Like you, all of my media is connected to the server via CIFS. It logs a similar error at startup:

2024-04-11 16:57:49 ERROR    TORNADO :: [34a67cf] Uncaught exception in APIv2: BlockingIOError(11, 'Resource temporarily unavailable')
Request: GET /api/v2/config/ (192.168.178.130)
...
BlockingIOError: [Errno 11] Resource temporarily unavailable: '/mnt/samba/TV.series/!NIEUW'

Did your problem resolve itself or did you have to do something else?

@jaxjexjox

This problem has just kicked in for me. I had a networking issue last night, so I thought it was that.

I've fixed that issue and re-spun a fresh Docker container of Medusa, AND I've rebooted the Ubuntu server host, AND I've rebooted the Proxmox host which was running the Ubuntu VM.

Nope and nope, still the same issue.

I'm also connected to a CIFS server?

@jaxjexjox

I have attempted to roll back to an earlier Docker image, but from what I can tell, this didn't work either.
So at this point I'm going to hold off and hope something gets fixed.

@StudioEtrange

+1

@StudioEtrange

StudioEtrange commented Apr 14, 2024

It seems to be a bug in CIFS:
https://bugs.launchpad.net/ubuntu/+source/linux/+bug/2060780
https://bugs.launchpad.net/ubuntu/+source/cifs-utils/+bug/2060797
This error is not linked to Medusa.
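
For anyone trying to confirm they're hitting the same kernel-side regression: the telltale split in this thread is that directory listing still works while `statvfs` fails with `EAGAIN` on the same path. A rough probe, assuming only the Python standard library (`probe_mount` is a hypothetical name):

```python
import errno
import os
import platform


def probe_mount(path):
    """Return (listdir_ok, statvfs_ok) for a mount point.

    On kernels hit by the regression in the Launchpad reports above, a
    CIFS mount typically still lists fine while statvfs fails with
    EAGAIN ("Resource temporarily unavailable") -- the exact split seen
    in the logs earlier in this thread.
    """
    try:
        os.listdir(path)
        listdir_ok = True
    except OSError:
        listdir_ok = False
    try:
        os.statvfs(path)
        statvfs_ok = True
    except OSError as err:
        if err.errno != errno.EAGAIN:
            raise
        statvfs_ok = False
    return listdir_ok, statvfs_ok


# Kernel build, to compare against the versions discussed in the bug reports:
print(platform.release())
```

`(True, False)` on the media mount would match the symptoms here; a healthy mount returns `(True, True)`.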

@jaxjexjox

I am a bit confused, because my Docker configuration data for Medusa is located on my local filesystem.
Yes, the media is on another, CIFS filesystem, but Medusa is indicating it can't even 'connect to itself'? Based on the pics, it can't see settings presumably located in config.ini on the local filesystem.
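
The traceback earlier in the thread explains this: the UI loads via `GET /api/v2/config/`, and that response aggregates free space for every configured location in one pass. One failing `statvfs` on the CIFS media path therefore aborts the whole request, and the frontend reports it can't connect at all, even though the config on the local filesystem is perfectly readable. A stripped-down stand-in (not Medusa's actual implementation) showing the failure shape:

```python
import os


def free_bytes(path):
    # Raises BlockingIOError (EAGAIN) on a wedged CIFS mount.
    st = os.statvfs(path)
    return st.f_bavail * st.f_frsize


def generate_location_disk_space(locations):
    # Stand-in for Medusa's helper of the same name: the mapping is
    # built in one pass, so a single bad location takes down the entire
    # config response -- local config disk included.
    return {path: free_bytes(path) for path in locations}
```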

@jaxjexjox

> It seems to be a bug in cifs, https://bugs.launchpad.net/ubuntu/+source/linux/+bug/2060780 https://bugs.launchpad.net/ubuntu/+source/cifs-utils/+bug/2060797 this error is not linked to medusa

Well

Regardless of my skills and assumptions, you're correct and my confusion is irrelevant!
I have applied this exact fix to my Ubuntu server:
https://old.reddit.com/r/Ubuntu/comments/1bzshdt/ubuntu_2204_smb_shares_stopped_working_lastnight/kzcfks9/

It totally fixed the problem so I really appreciate it.

Not a Medusa fault; it's odd for it to 'fall over' due to SMB/CIFS issues just for the data folders, but she seems OK now. Really appreciate it, thank you.

@StudioEtrange

@jaxjexjox I have applied the same fix as you.
Do you know how to revert back to the default Linux kernel? I don't know how.

@jaxjexjox

I have no idea, but I have thanked the person on Reddit and asked them for their advice for this next month.
