memory leak? #25

Open
FDiskas opened this issue Apr 23, 2021 · 17 comments
Assignees: stegano
Labels: good first issue (Good for newcomers) · question (Further information is requested)


FDiskas commented Apr 23, 2021

I was using this module like so:

import httpProxyMiddleware from 'next-http-proxy-middleware';
import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import * as Sentry from '@sentry/node';

import { initSentry } from 'src/utils/sentry';

export default async (req: NextApiRequest, res: NextApiResponse) => {
    const session = getSession(req, res);

    return new Promise((resolve) =>
        httpProxyMiddleware(req, res, {
            target: process.env.NEXT_PUBLIC_API_BASE_URL,
            headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
        }).catch(async (error) => {
            initSentry();
            Sentry.captureException(error);
            await Sentry.flush(2000);

            return resolve(error);
        })
    );
};

As you can see in the memory usage graph, there are spikes and some server restarts.
But when I switched to http-proxy-middleware like so:

import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import { createProxyMiddleware } from 'http-proxy-middleware';
import * as Sentry from '@sentry/node';

import { initSentry } from 'src/utils/sentry';

export const config = {
    api: {
        bodyParser: false,
        externalResolver: true,
    },
};

export default createProxyMiddleware({
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    headers: {
        'Content-Type': 'application/json-patch+json',
    },
    selfHandleResponse: false,
    changeOrigin: true,
    xfwd: true,
    secure: true,
    proxyTimeout: 3000,
    ws: true,
    onProxyReq: (proxyReq, req: Request & NextApiRequest, res: Response & NextApiResponse) => {
        proxyReq.setHeader('Authorization', `Bearer ${getSession(req, res)?.idToken}`);
    },
    onError: (err, req: Request & NextApiRequest, res: Response & NextApiResponse) => {
        initSentry();
        Sentry.captureException(err);
        Sentry.flush(2000);

        res.end('Something went wrong.');
    },
});

the memory usage became stable. I also got correct responses from the API, such as 404 statuses.

[screenshot: memory usage graph]

stegano self-assigned this on Apr 25, 2021

stegano commented Apr 25, 2021

Hi @FDiskas 😀

Thanks for reporting the problem.

I tried to reproduce it in my local environment, but I couldn't reproduce the leak.

When using this middleware, did the memory gradually increase, and was the server restarted by OOM?

Can you elaborate on the attached graph?


FDiskas commented Apr 25, 2021

Hi, before, the memory was slowly increasing over time. Those spikes and drop-downs were probably because of restarts. The project is running in a Docker container on AWS. I did not investigate deeper; I just noticed that during development the local server stops after some time and I need to restart it. After changing to the plain middleware, everything works without strange crashes.
Those graphs are taken from the AWS instance metrics.


stegano commented Apr 25, 2021

@FDiskas

Thank you for the explanation. 😀

Could you tell me what version of next-http-proxy-middleware you are currently using for your project?
Prior to v1.0.10, memory leaks may have occurred when using HTTP HEAD, DELETE, CONNECT, OPTIONS, and PUT methods. 😭


stegano commented Apr 25, 2021

[screenshots: memory inspector, taken 2021-04-25 11:38 PM]

It is not yet conclusive, but it seems that memory accumulates (as seen in the inspector) when requests are repeated many times with an ab load test. I'll check a little more. 🤔
Thanks for reporting.
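
For reference, a rough stand-in for that kind of load test (an assumption on my part, not the actual ab command used here): it fires a fixed number of sequential requests at a local route using only Node built-ins, so memory can be watched in the inspector in the meantime. The route URL and the request count are placeholders.

// Rough load-test stand-in (placeholder URL and count), using only Node built-ins.
import http from 'http';

const TARGET = 'http://localhost:3000/api/proxy'; // placeholder proxy route
const TOTAL = 4000;

const fire = (remaining: number): void => {
  if (remaining === 0) return;
  http.get(TARGET, (res) => {
    res.resume(); // drain the body so the socket is released
    res.on('end', () => fire(remaining - 1));
  }).on('error', () => fire(remaining - 1));
};

fire(TOTAL);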


FDiskas commented Apr 25, 2021

When I noticed there was a new release I updated as soon as possible, but didn't notice any improvement regarding memory.


stegano commented Apr 25, 2021

When I noticed there was a new release I updated as soon as possible, but didn't notice any improvement regarding memory.

😭


FDiskas commented Apr 25, 2021

Could be related to #21


stegano commented Apr 25, 2021

If possible, try applying the code like this:
(The problem I've identified is that when a promise (an async function) is exported, memory increases as the number of requests increases.)

// as-is
export default async (req: NextApiRequest, res: NextApiResponse) => {
    const session = getSession(req, res);

    return new Promise((resolve) =>
        httpProxyMiddleware(req, res, {
            target: process.env.NEXT_PUBLIC_API_BASE_URL,
            headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
        }).catch(async (error) => {
            initSentry();
            Sentry.captureException(error);
            await Sentry.flush(2000);

            return resolve(error);
        })
    );
};
// to-be ✨
export default (req: NextApiRequest, res: NextApiResponse) => {
  const session = getSession(req, res);
  return httpProxyMiddleware(req, res, {
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
  }).catch(async (error) => {
    initSentry();
    Sentry.captureException(error);
    await Sentry.flush(2000);
  });
};

Test result

[screenshot: 4,000 requests after the fixed code]

Request sent 4,000 times

// Test code
export default (req: NextApiRequest, res: NextApiResponse) => httpNextProxyMiddleware(req, res, {
  target: 'http://localhost:8080',
}).catch(async (error) => console.log(error));

• No continuous memory growth

[screenshot: 4,000 requests, nested promise]

Request sent 4,000 times

• Memory usage is constantly increasing

// Test code
export default async (req: NextApiRequest, res: NextApiResponse) => new Promise(
  (resolve) => httpNextProxyMiddleware(req, res, {
    target: 'http://localhost:8080',
  }).catch(async (error) => resolve(error)),
);
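
For anyone repeating the test, a minimal sketch (not part of the original test code) that adds heap logging to the route using Node's built-in process.memoryUsage(), so growth can be seen in the console without attaching the inspector:

// Sketch only: same shape as the test code above, with heap usage logged per request.
import { NextApiRequest, NextApiResponse } from 'next';
import httpNextProxyMiddleware from 'next-http-proxy-middleware';

export default (req: NextApiRequest, res: NextApiResponse) => {
  const { heapUsed, rss } = process.memoryUsage();
  console.log(`heapUsed=${(heapUsed / 1048576).toFixed(1)}MB rss=${(rss / 1048576).toFixed(1)}MB`);

  return httpNextProxyMiddleware(req, res, {
    target: 'http://localhost:8080',
  }).catch((error) => console.log(error));
};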


FDiskas commented Apr 25, 2021

I will try it on Monday and let you know.

stegano added the good first issue (Good for newcomers) and question (Further information is requested) labels on Apr 25, 2021

FDiskas commented Apr 26, 2021

Looks like something is still wrong.

import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import httpProxyMiddleware from 'next-http-proxy-middleware';
import * as Sentry from '@sentry/node';

import { initSentry } from 'src/utils/sentry';

export const config = {
    api: {
        bodyParser: false,
        externalResolver: true,
    },
};

export default async (req: NextApiRequest, res: NextApiResponse) => {
    const session = getSession(req, res);

    try {
        return await httpProxyMiddleware(req, res, {
            target: process.env.NEXT_PUBLIC_API_BASE_URL,
            headers: {
                Authorization: `Bearer ${session?.idToken}`,
                'Content-Type': 'application/json-patch+json',
            },
        });
    } catch (error: any) {
        initSentry();
        Sentry.captureException(error);
        await Sentry.flush(2000);

        return error?.message;
    }
};

And after some time I get a crash:

node_modules/http-proxy/lib/http-proxy/index.js:120
    throw err;
    ^

Error: socket hang up
    at connResetException (internal/errors.js:617:14)
    at TLSSocket.socketCloseListener (_http_client.js:443:25)
    at TLSSocket.emit (events.js:327:22)
    at TLSSocket.EventEmitter.emit (domain.js:486:12)
    at net.js:673:12
    at TCP.done (_tls_wrap.js:563:7) {
  code: 'ECONNRESET'


stegano commented Apr 28, 2021

Hi @FDiskas,

Please do not return the awaited promise 😭
Is there any reason to use async/await here?

export default async (req: NextApiRequest, res: NextApiResponse) => { // <-- fix to `export default (req: NextApiRequest, res: NextApiResponse) =>`
    const session = getSession(req, res);

    try {
        return await httpProxyMiddleware(req, res, { // <-- fix to `return httpProxyMiddleware`
            target: process.env.NEXT_PUBLIC_API_BASE_URL,
            headers: {
                Authorization: `Bearer ${session?.idToken}`,
                'Content-Type': 'application/json-patch+json',
            },
        });
    } catch (error: any) {
        initSentry();
        Sentry.captureException(error);
        await Sentry.flush(2000);

        return error?.message;
    }
};
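
Applied together, those two fixes would look roughly like this (just a sketch based on the snippet above; the try/catch is replaced with .catch, because without await a try/catch would no longer catch the rejection):

import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import httpProxyMiddleware from 'next-http-proxy-middleware';
import * as Sentry from '@sentry/node';

import { initSentry } from 'src/utils/sentry';

// Not async: the promise from httpProxyMiddleware is returned directly
// instead of being awaited and re-wrapped.
export default (req: NextApiRequest, res: NextApiResponse) => {
    const session = getSession(req, res);

    return httpProxyMiddleware(req, res, {
        target: process.env.NEXT_PUBLIC_API_BASE_URL,
        headers: {
            Authorization: `Bearer ${session?.idToken}`,
            'Content-Type': 'application/json-patch+json',
        },
    }).catch(async (error) => {
        initSentry();
        Sentry.captureException(error);
        await Sentry.flush(2000);
    });
};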


FDiskas commented Apr 28, 2021

The reason is simple: the ESLint rules require it. :) But I will check that later.


FDiskas commented May 13, 2021

Tested: memory is still increasing. I used code like this:

export const config = {
    api: {
        bodyParser: false,
        externalResolver: true,
    },
};

export default (req: NextApiRequest, res: NextApiResponse) => {
    return httpProxyMiddleware(req, res, {
        preserveHeaderKeyCase: true,
        target: process.env.NEXT_PUBLIC_API_BASE_URL,
        timeout: 3000,
        headers: {
            'Content-Type': 'application/json-patch+json',
        },
    }).catch(async (error) => {
        res.end(error.message);
    });
};

I built the app and then started it using

NODE_OPTIONS='--inspect' next start

attached the Chrome inspector and compared memory snapshots; the main issue points to next-http-proxy-middleware.

[screenshot: heap snapshot comparison]


FDiskas commented May 13, 2021

And after reverting back to http-proxy-middleware I got something better. Memory is still growing, but this time a different package is responsible.

[screenshot: heap snapshot comparison after reverting to http-proxy-middleware]


stegano commented May 14, 2021

Tested: memory is still increasing. I used code like this:

export const config = {
    api: {
        bodyParser: false,
        externalResolver: true,
    },
};

export default (req: NextApiRequest, res: NextApiResponse) => {
    return httpProxyMiddleware(req, res, {
        preserveHeaderKeyCase: true,
        target: process.env.NEXT_PUBLIC_API_BASE_URL,
        timeout: 3000,
        headers: {
            'Content-Type': 'application/json-patch+json',
        },
    }).catch(async (error) => {
        res.end(error.message);
    });
};

I built the app and then started it using

NODE_OPTIONS='--inspect' next start

attached the Chrome inspector and compared memory snapshots; the main issue points to next-http-proxy-middleware.

[screenshot: heap snapshot comparison]

JS memory can grow and shrink while the process is running; over time, unused memory is reclaimed by the GC.
In the case above, is there an error response when requesting through the proxy?
If you are getting an HTTP error response, remove the async from the catch callback and try again.

export default (req: NextApiRequest, res: NextApiResponse) => {
    return httpProxyMiddleware(req, res, {
        preserveHeaderKeyCase: true,
        target: process.env.NEXT_PUBLIC_API_BASE_URL,
        timeout: 3000,
        headers: {
            'Content-Type': 'application/json-patch+json',
        },
    }).catch((error) => { // <-- `async` removed here
        res.end(error.message);
    });
};

If the catch does not return a promise result, the promise will not be removed by the GC and will persist.


FDiskas commented May 14, 2021

Thank you for the good explanation.
In this case no errors were thrown in the requests. But as you said, I probably need to ignore such small memory changes.
Can we have a good working example in the repository?


stegano commented May 16, 2021

Thank you for the good explanation.
In this case no errors were thrown in the requests. But as you said, I probably need to ignore such small memory changes.
Can we have a good working example in the repository?

This library can be used in a variety of ways, making it difficult to give specific examples. However, if a good reference is recorded like this issue, I think I can refer to it later.
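
For future readers, here is a minimal sketch that pulls together the recommendations from this thread (the externalResolver config, a non-async handler that returns the middleware's promise, and a non-async catch). The target value and the 500 status are placeholders, not something prescribed by the library:

import { NextApiRequest, NextApiResponse } from 'next';
import httpProxyMiddleware from 'next-http-proxy-middleware';

export const config = {
    api: {
        bodyParser: false, // let the proxy stream the raw request body
        externalResolver: true, // the proxy, not Next.js, finishes the response
    },
};

// Not async: return the middleware's promise directly instead of awaiting it
// or wrapping it in a new Promise.
export default (req: NextApiRequest, res: NextApiResponse) =>
    httpProxyMiddleware(req, res, {
        target: process.env.NEXT_PUBLIC_API_BASE_URL, // placeholder target
    }).catch((error) => { // non-async catch, as discussed above
        res.statusCode = 500;
        res.end(error.message);
    });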
