Lettuce clientOptions config & NioSocketChannel may trigger memory leak #3286

Open

adar-v opened this issue May 3, 2025 · 2 comments

adar-v commented May 3, 2025

Bug Report

Current Behavior

I use Spring WebFlux and Lettuce. When I configure Lettuce so that its command processing does not run on the I/O thread, my service's memory surges every once in a while, and after a while the service becomes unresponsive and hangs.
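
To put a number on the surge I could add a small heap-usage probe; this is just a sketch (the class name, interval, and plain System.out logging are placeholders, not what runs in my service):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    // Logs heap usage once a minute so memory surges can be correlated with traffic.
    public class HeapProbe {
        public static void start() {
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(
                    () -> System.out.printf("heap used=%dMB committed=%dMB%n",
                            memory.getHeapMemoryUsage().getUsed() / (1024 * 1024),
                            memory.getHeapMemoryUsage().getCommitted() / (1024 * 1024)),
                    0, 1, TimeUnit.MINUTES);
        }
    }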

Stack trace
lettuce-nioEventLoop-7-1
  at sun.nio.ch.EPoll.wait(IJII)I (EPoll.java(Native Method))
  at sun.nio.ch.EPollSelectorImpl.doSelect(Ljava/util/function/Consumer;J)I (EPollSelectorImpl.java:118)
  at sun.nio.ch.SelectorImpl.lockAndDoSelect(Ljava/util/function/Consumer;J)I (SelectorImpl.java:129)
  at sun.nio.ch.SelectorImpl.select()I (SelectorImpl.java:146)
  at io.netty.channel.nio.SelectedSelectionKeySetSelector.select()I (SelectedSelectionKeySetSelector.java:68)
  at io.netty.channel.nio.NioEventLoop.select(J)I (NioEventLoop.java:813)
  at io.netty.channel.nio.NioEventLoop.run()V (NioEventLoop.java:460)
  at io.netty.util.concurrent.SingleThreadEventExecutor$4.run()V (SingleThreadEventExecutor.java:997)
  at io.netty.util.internal.ThreadExecutorMap$2.run()V (ThreadExecutorMap.java:74)
  at io.netty.util.concurrent.FastThreadLocalRunnable.run()V (FastThreadLocalRunnable.java:30)
  at java.lang.Thread.run()V (Thread.java:833)

Input Code

Old configuration, no memory leak:
    @Bean
    public ReactiveRedisConnectionFactory reactiveRedisConnectionFactory() {
        return new LettuceConnectionFactory(aliRedisProperties.getHost(), aliRedisProperties.getPort());
    }

New configuration, triggers the memory leak:

    @Bean
    public ReactiveRedisConnectionFactory reactiveRedisConnectionFactory() {
        LettuceClientConfiguration configuration = LettuceClientConfiguration.builder()
                .clientOptions(ClientOptions.builder().publishOnScheduler(true).build())
                .build();
        return new LettuceConnectionFactory(new RedisStandaloneConfiguration(aliRedisProperties.getHost(), aliRedisProperties.getPort()), configuration);
    }
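
For context, as far as I understand, publishOnScheduler(true) makes Lettuce emit command results on the computation thread pool of its ClientResources instead of the Netty I/O event loop. Below is a sketch of the same bean with that pool made explicit; the pool sizes are placeholders, not my production values:

    @Bean
    public ReactiveRedisConnectionFactory reactiveRedisConnectionFactory() {
        // ioThreadPoolSize sizes the Netty event loops; computationThreadPoolSize sizes the
        // pool that publishOnScheduler(true) emits results on. Both sizes below are placeholders.
        ClientResources resources = DefaultClientResources.builder()
                .ioThreadPoolSize(4)
                .computationThreadPoolSize(4)
                .build();
        LettuceClientConfiguration configuration = LettuceClientConfiguration.builder()
                .clientResources(resources)
                .clientOptions(ClientOptions.builder().publishOnScheduler(true).build())
                .build();
        return new LettuceConnectionFactory(
                new RedisStandaloneConfiguration(aliRedisProperties.getHost(), aliRedisProperties.getPort()),
                configuration);
    }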

Expected behavior/code

Environment

  • Spring framework: 2.6.9
  • Lettuce version(s): 6.1.8.RELEASE
  • Redis version: 5.0.6
  • Netty: 4.1.78

Possible Solution

Additional context

Memory dump (leak suspects report):

one instance of io.netty.channel.socket.nio.NioSocketChannel loaded by org.springframework.boot.loader.LaunchedURLClassLoader @ 0x6458005b0 occupies 2,443,638,760 (91.93%) bytes. The memory is accumulated in one instance of java.lang.Object[], loaded by <system class loader>, which occupies 2,443,636,752 (91.93%) bytes.

Thread io.netty.util.concurrent.FastThreadLocalThread @ 0x645bb4888 lettuce-nioEventLoop-7-1 has a local variable or reference to sun.nio.ch.EPollSelectorImpl @ 0x645bbe688 which is on the shortest path to java.lang.Object[550104] @ 0x6e43aa798. The thread io.netty.util.concurrent.FastThreadLocalThread @ 0x645bb4888 lettuce-nioEventLoop-7-1 keeps local variables with total size 29,232 (0.00%) bytes.

Significant stack frames and local variables

sun.nio.ch.EPollSelectorImpl.doSelect(Ljava/util/function/Consumer;J)I (EPollSelectorImpl.java:118)
sun.nio.ch.EPollSelectorImpl @ 0x645bbe688 retains 832 (0.00%) bytes
sun.nio.ch.SelectorImpl.lockAndDoSelect(Ljava/util/function/Consumer;J)I (SelectorImpl.java:129)
sun.nio.ch.EPollSelectorImpl @ 0x645bbe688 retains 832 (0.00%) bytes

tishun (Collaborator) commented May 7, 2025

Hey @adar-v,

thanks for reporting this issue. Could you help us reproduce this so we can solve it?
Can you build up a minimum reproducible example that showcases the issue, or is it very random?
How often does it happen?
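
If it helps, a standalone skeleton along these lines (no Spring, same ClientOptions as above) could be a starting point for a reproducer; the class name, Redis URI, key count, and concurrency are placeholders, and we have not verified that this load pattern actually triggers the leak:

    import io.lettuce.core.ClientOptions;
    import io.lettuce.core.RedisClient;
    import io.lettuce.core.api.StatefulRedisConnection;
    import io.lettuce.core.api.reactive.RedisReactiveCommands;
    import reactor.core.publisher.Flux;

    public class PublishOnSchedulerRepro {
        public static void main(String[] args) {
            RedisClient client = RedisClient.create("redis://localhost:6379"); // placeholder URI
            client.setOptions(ClientOptions.builder().publishOnScheduler(true).build());
            try (StatefulRedisConnection<String, String> connection = client.connect()) {
                RedisReactiveCommands<String, String> reactive = connection.reactive();
                // Sustained reactive load; watch heap / NioSocketChannel retained size while this runs.
                Flux.range(0, 1_000_000)
                        .flatMap(i -> reactive.set("repro:" + i, "value").then(reactive.get("repro:" + i)), 64)
                        .blockLast();
            } finally {
                client.shutdown();
            }
        }
    }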


adar-v commented May 15, 2025

  1. After changing to another connection mode it still happens, but the frequency seems to have decreased.
  2. The problem occurs very randomly; I have not found any timing or behavioral pattern, so I cannot provide a demo.
