Atlassian Bitbucket SonarQube Pipe error: “Container ‘docker’ exceeded memory limit”

I’m using Bitbucket Pipelines for my builds. A previously working SonarQube analysis step is now failing with the error “Container ‘docker’ exceeded memory limit.” This started happening after our codebase increased in size.

This is my configuration:

image: atlassian/default-image:2

clone:
  depth: full       # clone the entire commit history

definitions: 
  docker:
    memory: 8192
  caches:
    sonar: ~/.sonar
  steps:
    - step: &run_sonar_qube_analysis
        name: "Run SonarQube Analysis"
        size: 2x # Double resources available for this step.
        script:
          - pipe: sonarsource/sonarqube-scan:1.0.0
            variables:
              SONAR_HOST_URL: ${SONAR_HOST_URL}
              SONAR_TOKEN: ${SONAR_TOKEN}
              SONAR_SCANNER_OPTS: -Xmx8192m
              EXTRA_ARGS: -Dsonar.php.coverage.reportPaths=test-reports/unittestscoverage.xml,test-reports/integrationtestscoverage.xml -Dsonar.php.tests.reportPath=test-reports/unittestsjunit.xml,test-reports/integrationtestsjunit.xml
        artifacts:
          - test-reports/**
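
For completeness, this step is referenced from the pipelines section via the YAML alias, roughly like this (the branch name here is illustrative, not my actual branch):

pipelines:
  branches:
    main:
      - step: *run_sonar_qube_analysis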

I’ve already tried the following to resolve the issue (the combined changes are sketched after the list):

  • Increased Docker service memory to 16384 MB (in definitions).
  • Increased the SonarQube step size to 4x.
  • Set SONAR_SCANNER_OPTS to -Xmx16384m.
  • Reduced the clone depth to 1, so only the latest commit is fetched.
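
Combined, those attempts amount to roughly the following configuration (only the keys I modified are shown; the values are taken from the list above):

clone:
  depth: 1                        # was: full

definitions:
  docker:
    memory: 16384                 # was: 8192
  steps:
    - step: &run_sonar_qube_analysis
        size: 4x                  # was: 2x
        script:
          - pipe: sonarsource/sonarqube-scan:1.0.0
            variables:
              SONAR_SCANNER_OPTS: -Xmx16384m   # was: -Xmx8192m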

Despite these changes, the error persists. What else can I try to resolve this memory issue in the SonarQube step of my Bitbucket Pipeline? Are there memory-optimization techniques specific to running SonarQube in Bitbucket Pipelines that I should be aware of? Could the larger codebase be directly increasing the memory requirements of the underlying services?