Memory usage with large datasets #109

@adbussy

Description

Dear Team,

First of all, thank you very much for this great package.

I run into an issue when trying to run multisynth() on a large(r) dataset.

It looks like R is trying to allocate an enormous vector in memory, and the estimation fails.

Is there a way, within the current capabilities of the package, to lower memory usage?

Many thanks, and best wishes.
