
Memory Profiling a Bud service #372

Open
Fuerback opened this issue Feb 1, 2023 · 5 comments
Labels
waiting for info (We're waiting for a response)

Comments

@Fuerback

Fuerback commented Feb 1, 2023

I'm wondering if there is a way to do memory profiling in a Bud service, because I'm noticing a possible memory leak.

I've been reading about Go's native pprof package, but to do this analysis it's necessary to have access to the main package or the routers.
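
From what I've read, the usual way to expose it in a plain Go program looks roughly like this (just a sketch using the standard net/http/pprof package; the open question is where to hook this in, since I don't have direct access to the main package in a Bud app):

package main

import (
    "log"
    "net/http"
    _ "net/http/pprof" // registers the /debug/pprof/* handlers on http.DefaultServeMux
)

func main() {
    // Serve the profiling endpoints on a separate local port.
    go func() {
        log.Println(http.ListenAndServe("localhost:6060", nil))
    }()

    // ... the rest of the service would run here ...
    select {} // block forever (stand-in for the real service)
}

With that running, go tool pprof http://localhost:6060/debug/pprof/heap can take heap snapshots.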

The image below shows the service using more and more memory; it never levels off.
[Screenshot from 2023-02-01 09-04-21: memory usage climbing steadily]

I was using version 0.2.5 and recently updated to 0.2.8, but the leak happens in both versions.

@matthewmueller
Contributor

matthewmueller commented Feb 2, 2023

Hey @Fuerback, thanks for doing this research! I'd say it's still a bit early days for memory optimization in Bud, but I did see rogchap/v8go#367 the other day, which is potentially related.

Are you experiencing out-of-memory (OOM) errors? It might just be memory building up until garbage collection kicks in after a bit.
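
One way to tell slow garbage collection apart from a real leak (a generic Go sketch, nothing Bud-specific) is to periodically force a collection and log what's still live:

import (
    "log"
    "runtime"
    "time"
)

// logHeap forces a collection every 30 seconds and logs the live heap.
// If HeapAlloc keeps growing even right after a GC, something is retaining
// memory; if it drops back down, the growth was just uncollected garbage.
func logHeap() {
    var m runtime.MemStats
    for range time.Tick(30 * time.Second) {
        runtime.GC()
        runtime.ReadMemStats(&m)
        log.Printf("heap alloc=%d MiB, sys=%d MiB", m.HeapAlloc>>20, m.Sys>>20)
    }
}

One caveat: V8 allocates on the C++ side, outside the Go heap, so a leak in v8go would show up in the process RSS but not in HeapAlloc.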

If you dig any deeper, please share your findings!

@Fuerback
Author

Fuerback commented Feb 2, 2023

Hey @matthewmueller, I'm not experiencing any OOM errors.

I'll share if I find something interesting.

Thanks for the quick answer!

@matthewmueller matthewmueller added the waiting for info We're waiting for a response label Feb 6, 2023
@dobarx

dobarx commented Feb 8, 2023

The V8 isolate is never closed:

func Eval(path, code string) (string, error) {

@dobarx

dobarx commented Feb 8, 2023

Also, when reusing isolates, the value should be released after Eval; otherwise this can also leak memory. The new v8go release has:

defer value.Release()

I had a similar problem on an experimental side project where I tried to write a simpler Svelte renderer in Go. I wanted to reuse isolates for better performance, but memory usage would just keep climbing since the rendered HTML never got cleared from memory.

I tried adding value.Release() and it seems to have solved the issue.
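
Roughly, the pattern that worked for me looked like this (just a sketch against the rogchap/v8go API; the Renderer type is from my experiment, not Bud's code):

import v8 "rogchap.com/v8go"

// Renderer reuses one isolate and context across renders (cheaper than
// creating them per call), but releases each result value so the rendered
// HTML doesn't pile up on the V8 side.
type Renderer struct {
    iso *v8.Isolate
    ctx *v8.Context
}

func NewRenderer() *Renderer {
    iso := v8.NewIsolate()
    return &Renderer{iso: iso, ctx: v8.NewContext(iso)}
}

func (r *Renderer) Render(path, expr string) (string, error) {
    value, err := r.ctx.RunScript(expr, path)
    if err != nil {
        return "", err
    }
    defer value.Release() // without this, each result stays alive in the isolate
    return value.String(), nil
}

func (r *Renderer) Close() {
    r.ctx.Close()
    r.iso.Dispose()
}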

@matthewmueller
Contributor

matthewmueller commented Feb 9, 2023

@dobarx, thanks for your research!

Right now we are re-using the isolates and context for the duration of the running process, but they do get cleaned up on close:

bud/package/js/v8/v8.go

Lines 133 to 138 in bff7b1f

func (vm *VM) Close() error {
    vm.context.Close()
    vm.isolate.TerminateExecution()
    vm.isolate.Dispose()
    return nil
}

This feels like the right approach: we only need to load the Svelte templates once, and then when rendering HTML we're just calling those functions over and over again. I haven't had time to do any more in-depth research on this topic yet.

FWIW, my thinking around keeping the context around is that since we're running developer-controlled scripts, we're not concerned about the security issues that would otherwise crop up with user-supplied scripts. It's a bit faster too, IIRC.

You are right that we aren't currently releasing the value:

bud/package/js/v8/v8.go

Lines 113 to 131 in bff7b1f

func (vm *VM) Eval(path, expr string) (string, error) {
    value, err := vm.context.RunScript(expr, path)
    if err != nil {
        return "", err
    }
    // Handle promises
    if value.IsPromise() {
        prom, err := value.AsPromise()
        if err != nil {
            return "", err
        }
        // TODO: this could run forever
        for prom.State() == v8go.Pending {
            continue
        }
        return prom.Result().String(), nil
    }
    return value.String(), nil
}

value.Release() does seem promising.
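
If we go that route, the change would look roughly like this (just a sketch, not a tested patch; I'm assuming the promise result needs releasing as well):

func (vm *VM) Eval(path, expr string) (string, error) {
    value, err := vm.context.RunScript(expr, path)
    if err != nil {
        return "", err
    }
    // Release the top-level value once we're done with it
    defer value.Release()
    // Handle promises
    if value.IsPromise() {
        prom, err := value.AsPromise()
        if err != nil {
            return "", err
        }
        // TODO: this could run forever
        for prom.State() == v8go.Pending {
            continue
        }
        result := prom.Result()
        defer result.Release()
        return result.String(), nil
    }
    return value.String(), nil
}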
