Trying to understand PHP Garbage Collection [duplicate]

Forgive my ignorance, but I’ve not played around with PHP’s garbage collection before. Specifically, I’m running a CakePHP 5.x Command script and running into memory exhaustion issues.

As a test to see how I can clear some memory, I’m trying to run gc_collect_cycles(); on this part of the code:

$connection = ConnectionManager::get('default');
    $this->logMessage('Memory usage before query: ' . memory_get_usage(true) / 1024 / 1024 . 'MB');
    $studentList = $connection->execute("SELECT DISTINCT
                stu.STU_ID AS person_id
        FROM SMS.S1STU_DET AS stu
        LEFT JOIN Pulse.$studentsTableName AS students
            ON students.person_id = stu.STU_ID
        LEFT JOIN Pulse.$coursesTableName AS courses
            ON courses.person_id = stu.STU_ID
        LEFT JOIN Pulse.$unitsTableName AS units
            ON units.person_id = stu.STU_ID
        LEFT JOIN Pulse.$rawCasesTableName AS cases
            ON cases.person_id = stu.STU_ID
        WHERE   1 = 1
            AND (
                students.person_id IS NOT NULL
                OR
                courses.person_id IS NOT NULL
                OR
                units.person_id IS NOT NULL
                OR
                cases.person_id IS NOT NULL
            )
    ")->fetchAll('assoc');

    $this->logMessage('Memory usage after query: ' . memory_get_usage(true) / 1024 / 1024 . 'MB');

    unset($studentList);
    gc_collect_cycles();

    $this->logMessage('Memory usage after garbage collection: ' . memory_get_usage(true) / 1024 / 1024 . 'MB');
    exit();

And this is the output I get:

2025-02-23 11:32:54 - Memory usage before query: 8MB
2025-02-23 11:32:57 - Memory usage after query: 48MB
2025-02-23 11:32:57 - Memory usage after garbage collection: 44MB

As you can see, gc_collect_cycles() didn’t find anything to clean up (it drops to 44MB regardless of whether I run gc_collect_cycles() or not). So I’m obviously not understanding and/or using this correctly. Is there any way I can free up memory and get back close to the starting 8MB?
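
For context, this is the kind of change I’m considering to keep peak usage down (a sketch only, assuming the statement returned by execute() can be fetched row by row with fetch('assoc'); $sql stands for the same query as above and processPerson() is a hypothetical per-row handler). My understanding is that even after freeing, memory_get_usage(true) may not drop back to 8MB, because PHP keeps freed blocks in its own allocator rather than handing them back to the OS.

$statement = $connection->execute($sql);
$count = 0;

// Process one row at a time instead of materialising the whole
// result set with fetchAll('assoc').
while (($row = $statement->fetch('assoc')) !== false) {
    $this->processPerson($row['person_id']);   // hypothetical per-row handler
    $count++;
}

$this->logMessage('Rows processed: ' . $count . ', memory: ' . memory_get_usage(true) / 1024 / 1024 . 'MB');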

Modify main WordPress query to limit results to a list of possible Post IDs

I am working on a WordPress/WooCommerce plugin feature where specific Customers are shown only a subset of Products.
The user has a metadata field containing an array of post IDs corresponding to the allowed Products.

I am trying to hook into the main query:

add_action( 'pre_get_posts', 'customprefix_pre_get_posts' );

function customprefix_pre_get_posts( $query )
{
  $user = wp_get_current_user();

  if ( $query->is_main_query() && is_product_category() )
  {
    /** @var array $allowed Post IDs */
    $allowed = customprefix_get_allowed_products_per_user( $user );

    $query->set( 'post__in', $allowed );
  }
}

I have tried many different ways of adding the post__in clause to the query, including $query->query_vars['post__in'] = $allowed and others. I have also tried calling $query->parse_query_vars(), with no success.

While diving further into the get_posts() function in WP_Query, it appears that my change to the query is returned correctly (by reference) from the pre_get_posts hook. The next line, $q = $this->fill_query_vars( $q );, somehow loses my custom post__in field.

Most of the documentation for WP_Query and post__in revolves around creating a new query, not modifying the main query.
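
For reference, this is roughly the shape I’ve been testing (a sketch; customprefix_get_allowed_products_per_user() is my own helper, and the empty-array guard is there because I’m not sure how WP_Query treats an empty post__in):

add_action( 'pre_get_posts', 'customprefix_pre_get_posts' );

function customprefix_pre_get_posts( $query )
{
  // Only touch the main front-end query on product category pages.
  if ( is_admin() || ! $query->is_main_query() || ! is_product_category() ) {
    return;
  }

  $user    = wp_get_current_user();
  $allowed = customprefix_get_allowed_products_per_user( $user ); // my helper

  // Guard against an empty list and force integer IDs.
  if ( is_array( $allowed ) && ! empty( $allowed ) ) {
    $query->set( 'post__in', array_map( 'intval', $allowed ) );
  }
}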

How to make a background bash script pause while an npm build runs?

I’m setting up a new method of deploying my personal React website, which uses Vite for bundling. My idea is:

  1. I push un-bundled changes from local to remote repo
  2. a web hook informs my server (which has a listening php script)
  3. the php script triggers a bash script that: 1) runs an npm install, 2) runs an npm run build and 3) runs an rsync to copy the build contents to the web’s live folder

When I run the bash script manually via PuTTY, it works fine. I can watch Vite’s build output scroll in the terminal, and eventually the rsync is triggered. But when the bash script is run via the PHP script, it does not seem to wait for the build (and perhaps even the install) to complete. It just copies whatever is in the build folder over to the live site (which is always the result of the last build, because the current build has not completed yet).

Is there a way to make the bash script wait for the install and the build to complete? I’ve tried adding a pause 120 after the install and build statements, and I’ve tried chaining the commands, for example something like

npm install && npm run build && rsync ......

Neither accomplishes the goal when the script is run via PHP, but both work perfectly when I run the script manually.

The relevant line from the php script that calls the bash script is:

$commandOutput = shell_exec("/bin/bash ./sync.sh --repo=" . $hook["repo"] . " --branch=" . $hook["branch"] . " --subdomain=" . $subdomain);

$hook["repo"] is the name of the repo, $hook["branch"] is the current branch, and $subdomain is the subdomain being worked on (I have multiple set up).

The bash script starts out by picking up the repo, branch, and subdomain and then makes the various git and npm commands. The relevant lines are:

if [[ ${repo} && ${branch} && ${subdomain} ]]; then
  git -C /path/"${repo}"/ checkout "${branch}"
  git -C /path/"${repo}"/ fetch
  git -C /path/"${repo}"/ pull origin "${branch}"
  npm --prefix /path/"${repo}"/ install
  npm --prefix /path/"${repo}"/ run build;
  rsync -auv /path/"${repo}"/htdocs/* /path/"${subdomain}"/htdocs;
else
  echo "repo, branch and subdomain were not provided, I will not do anything.";
fi
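
To debug this, I’m considering a variant of the PHP call that captures the exit status and stderr (a sketch; it assumes exec() is not disabled for the web PHP user), so that a failing npm install or npm run build becomes visible instead of silently leaving the old build in place:

$cmd = '/bin/bash ./sync.sh'
     . ' --repo=' . escapeshellarg($hook['repo'])
     . ' --branch=' . escapeshellarg($hook['branch'])
     . ' --subdomain=' . escapeshellarg($subdomain)
     . ' 2>&1';                       // merge stderr into the captured output

exec($cmd, $outputLines, $exitCode);  // exec() also waits for the command to finish

error_log('sync.sh exited with code ' . $exitCode);
error_log(implode("\n", $outputLines));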

Redis streams memory leak?

It would take too long to explain why I use streams for this simple task (and I can’t change it), but at the moment I save multiple pieces of data into different keys, each associated with a stream.
After the data is read from the stream with XRANGE, I need to completely destroy the data and the key.
Yes, just one access to the stream and then everything has to be destroyed.
I have tried everything: EXPIRE 0 on the key, DEL, XTRIM with MAXLEN 0 (and then DEL), and nothing works; my memory keeps growing until exhaustion.
Even if only one key (stream) is used, XRANGEd and then deleted, the same memory leak applies.
What I see is:

Memory

used_memory:27305504
used_memory_human:26.04M
used_memory_rss:17321984
used_memory_rss_human:16.52M
used_memory_peak:27349928
used_memory_peak_human:26.08M
used_memory_peak_perc:99.84%
used_memory_overhead:681540
used_memory_startup:660608
used_memory_dataset:26623964
used_memory_dataset_perc:99.92%
allocator_allocated:29209872
allocator_active:38076416
allocator_resident:38535168
total_system_memory:2050424832
total_system_memory_human:1.91G
used_memory_lua:37888
used_memory_lua_human:37.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:33554432
maxmemory_human:32.00M
maxmemory_policy:volatile-lru
allocator_frag_ratio:1.30
allocator_frag_bytes:8866544
allocator_rss_ratio:1.01
allocator_rss_bytes:458752
rss_overhead_ratio:0.45
rss_overhead_bytes:-21213184
mem_fragmentation_ratio:0.64
mem_fragmentation_bytes:-9944392
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:20496
mem_aof_buffer:0
mem_allocator:jemalloc-5.2.1
active_defrag_running:0
lazyfree_pending_objects:0

I’m using phpredis to do $redis->del($key), etc.

I can’t use an ordinary key to store the data (with an ordinary key, DEL causes no leak).
Redis version 6.0.16
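
For reference, this is the minimal cycle I’m doing with phpredis (a sketch of the pattern rather than my exact code; the key name is made up). After DEL the key is gone from the keyspace, yet used_memory stays where the INFO output above shows it:

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$key = 'stream:example';                        // hypothetical key name
$redis->xAdd($key, '*', ['payload' => str_repeat('x', 1024)]);

$entries = $redis->xRange($key, '-', '+');      // the single read of the stream

// Attempts to reclaim the memory afterwards:
$redis->xTrim($key, 0);                         // XTRIM ... MAXLEN 0
$redis->del($key);                              // the key no longer exists,
                                                // but INFO memory barely moves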

Symfony PHPUnit tests fail in parallel in GitHub Actions

I’m encountering an issue with my PHPUnit paratest configuration… or maybe my CI, Docker, or even my code — I’m not sure.

I have some tests for my Symfony web application, and I’m facing a problem where the user does not remain logged in. When I log in the user using $this->client->login(), no errors occur, and the assertion following the login confirms success. However, as soon as I make a subsequent request, the user appears to be logged out and is redirected to the login page.

If this were happening with every request, I’d have a better idea of where to start troubleshooting, but it only occurs intermittently. Additionally, this issue only happens in the GitHub Actions runner. When I run the tests locally, everything works fine — whether I use paratest or plain PHPUnit.

However, I recently discovered that if I run the tests sequentially on CI, everything works as expected.

Has anyone experienced this issue or know of a fix?

Here is the section of my checks.yml which triggers the test

tests:
  name: Tests
  runs-on: ubuntu-latest
  steps:
    - name: Check out code
      uses: actions/checkout@v4

    - name: Docker Compose Up
      run: docker compose up -d

    - name: Wait for services
      run: sleep 30

    - name: Setup Test Environment
      run: |
        docker compose exec app bash -c "
          rm -rf var/cache/*
          mkdir -p var/cache/test
          chmod -R 777 var/
          composer dump-autoload
          php bin/console cache:clear --env=test
          APP_ENV=test php bin/console cache:warmup
          chmod -R 777 var/
        "

    - name: Paratest
      run: |
        docker compose exec app bash -c "
          APP_ENV=test vendor/bin/paratest --configuration=phpunit.xml.dist --runner=WrapperRunner --testdox --verbose
          vendor/bin/phpunit --configuration=phpunit.xml.dist
        "

    - name: Docker Compose Down
      if: always()
      run: docker compose down

And here is an example test; in it, testActive always fails:

<?php

namespace App\Tests\Controller;

use App\Entity\TutorProfile;
use App\Entity\User;
use App\Factory\PermissionFactory;
use App\Factory\StudentProfileFactory;
use App\Factory\TutorProfileFactory;
use App\Factory\UserFactory;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\KernelBrowser;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Zenstruck\Foundry\Test\Factories;
use Zenstruck\Foundry\Test\ResetDatabase;

final class StudentControllerTest extends WebTestCase
{
    use Factories;
    use ResetDatabase;

    private KernelBrowser $client;
    private EntityManagerInterface $entityManager;
    private User $user;
    private TutorProfile $profile;

    protected function setUp(): void
    {
        $this->client = static::createClient();
        $this->entityManager = static::getContainer()->get(EntityManagerInterface::class);

        $this->user = UserFactory::createOne()->_real();
        $this->profile = TutorProfileFactory::createOne(['user_entity' => $this->user])->_real();
        $permission = PermissionFactory::createOne(['name' => 'CAN_VIEW_STUDENT_LIST'])->_real();
        $this->profile->addPermission($permission);
        $this->entityManager->flush();

        $this->client->loginUser($this->user);
        $this->client->request('GET', '/select-profile');
        $this->client->followRedirects();
    }

    public function testActive(): void
    {
        StudentProfileFactory::createMany(5, [
            'archived' => false,
            'deactivated' => false,
            'school' => $this->profile->getSchool(),
        ]);

        $this->client->request('GET', '/students');

        $this->assertResponseIsSuccessful();
        $this->assertSelectorTextContains('body > div.flex.min-h-screen > div.flex-1.flex.flex-col > main > div > h2', 'Schülerliste');

        $this->assertCount(5, $this->client->getCrawler()->filter('tbody tr'));
    }

    public function testArchived(): void
    {
        StudentProfileFactory::createMany(5, [
            'archived' => true,
            'deactivated' => false,
            'school' => $this->profile->getSchool(),
        ]);

        $this->client->request('GET', '/students/archived');

        $this->assertResponseIsSuccessful();
        $this->assertSelectorTextContains('body > div.flex.min-h-screen > div.flex-1.flex.flex-col > main > div > h2', 'Schülerliste');

        $this->assertCount(5, $this->client->getCrawler()->filter('tbody tr'));
    }

    public function testDeactivated(): void
    {
        StudentProfileFactory::createMany(5, [
            'archived' => false,
            'deactivated' => true,
            'school' => $this->profile->getSchool(),
        ]);

        $this->client->request('GET', '/students/deactivated');

        $this->assertResponseIsSuccessful();
        $this->assertSelectorTextContains('body > div.flex.min-h-screen > div.flex-1.flex.flex-col > main > div > h2', 'Schülerliste');
    }

    # ...

}

And here is the error:

1) App\Tests\Controller\StudentControllerTest::testActive
Failed asserting that Symfony\Component\DomCrawler\Crawler Object &0000000000000bf90000000000000000 (
    'defaultNamespacePrefix' => 'default'
    'namespaces' => Array &0 ()
    'cachedNamespaces' => ArrayObject Object &0000000000000f200000000000000000 ()
    'baseHref' => 'http://localhost/login'
    'document' => DOMDocument Object &0000000000000f240000000000000000 ()
    'nodes' => Array &1 (
        0 => DOMElement Object &0000000000000f220000000000000000 (
            'schemaTypeInfo' => null
        )
    )
    'isHtml' => true
    'html5Parser' => Masterminds\HTML5 Object &0000000000000aab0000000000000000 (
        'defaultOptions' => Array &2 (
            'encode_entities' => false
            'disable_html_ns' => true
        )
        'errors' => Array &3 ()
    )
    'uri' => 'http://localhost/login'
) matches selector "body > div.flex.min-h-screen > div.flex-1.flex.flex-col > main > div > h2" and the Crawler has a node matching selector "body > div.flex.min-h-screen > div.flex-1.flex.flex-col > main > div > h2".
/var/www/html/vendor/symfony/framework-bundle/Test/DomCrawlerAssertionsTrait.php:44
/var/www/html/tests/Controller/StudentControllerTest.php:54
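
One direction I’m considering is giving each paratest worker its own database via the TEST_TOKEN variable that paratest sets per worker process (a sketch only; it assumes Doctrine is configured through DATABASE_URL and that the suffix handling below matches my URL format):

// tests/bootstrap.php (sketch)
$token = getenv('TEST_TOKEN') ?: '';

if ($token !== '' && ($url = getenv('DATABASE_URL')) !== false) {
    // Insert the worker suffix before any query string, e.g.
    // mysql://user:pass@db:3306/app_test?serverVersion=8 -> .../app_test_3?serverVersion=8
    $parts  = explode('?', $url, 2);
    $newUrl = $parts[0] . '_' . $token . (isset($parts[1]) ? '?' . $parts[1] : '');

    putenv('DATABASE_URL=' . $newUrl);
    $_ENV['DATABASE_URL'] = $_SERVER['DATABASE_URL'] = $newUrl;
}

require dirname(__DIR__) . '/vendor/autoload.php';

Each worker would then work against its own schema, which I hope removes the cross-process interference with sessions and fixtures.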

Laravel OpenAI Streaming (SSE) returns broken words instead of full sentences

I am using Laravel and OpenAI to generate blog content in real-time using Server-Sent Events (SSE). However, instead of full sentences, the output is coming in broken words, which makes the generated text unreadable.

For example, instead of:

"Freelance work is a great way to become independent."

I get:

"Free lan ce work is a gre at way to be come in dep end ent."

It seems like the response is not properly handling tokenized streaming, causing words to be split incorrectly.

What I Have Tried

1. Ensured correct SSE headers

  • My Laravel response already includes:
return response()->stream(function () { ... }, 200, [
    'Content-Type' => 'text/event-stream',
    'Cache-Control' => 'no-cache',
    'Connection' => 'keep-alive',
    'X-Accel-Buffering' => 'no',
]);

Current Laravel Controller:

<?php

namespace App\Http\Controllers\Dashboard\Blog\AI;

use App\Http\Controllers\Controller;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;
use App\Services\OpenAIService;
use App\Models\AIDocument;
use App\Models\AIBlog;

class AIWriterController extends Controller
{
    protected $openAIService;

    public function __construct(OpenAIService $openAIService)
    {
        $this->openAIService = $openAIService;
    }

    public function index()
    {
        Log::info("AIWriterController@index çağrıldı.");

        $models = config('openai.models');
        $tones = config('openai.tones');
        $languages = config('languages.supported');
        $settings = $this->openAIService->getSettings();

        return view('dashboard.blog.ai.index', compact('models', 'tones', 'languages', 'settings'));
    }

    public function generate(Request $request)
    {
        $validated = $request->validate([
            'article_title' => 'required|string',
            'focus_keywords' => 'nullable|string',
            'exclude_keywords' => 'nullable|string',
            'default_language' => 'required|string',
            'default_tone' => 'required|string',
            'max_words' => 'required|integer|min:50|max:2000',
        ]);

        $prompt = "Başlık: " . $validated['article_title'] . "n";
        $prompt .= "Anahtar Kelimeler: " . $validated['focus_keywords'] . "n";
        $prompt .= "Hariç Tutulacak: " . $validated['exclude_keywords'] . "n";
        $prompt .= "Ton: " . $validated['default_tone'] . "n";
        $prompt .= "Dil: " . $validated['default_language'] . "n";

        return $this->openAIService->generateStreamedText($prompt, $validated['max_words']);
    }

    public function save(Request $request)
    {
        Log::info("save metodu çalıştırıldı.", $request->all());

        $validated = $request->validate([
            'title' => 'required|string',
            'description' => 'required|string',
        ]);

        Log::info("save için doğrulama tamamlandı.", $validated);

        AIDocument::create([
            'title' => $request->title,
            'content' => $request->description,
            'user_id' => auth()->id(),
        ]);

        Log::info("Makale başarıyla kaydedildi.");

        return response()->json(['message' => 'Makale başarıyla kaydedildi.']);
    }

    public function publish(Request $request)
    {
        Log::info("publish metodu çalıştırıldı.", $request->all());

        $validated = $request->validate([
            'title' => 'required|string',
            'description' => 'required|string',
        ]);

        Log::info("publish için doğrulama tamamlandı.", $validated);

        AIBlog::create([
            'title' => $request->title,
            'content' => $request->description,
            'user_id' => auth()->id(),
            'status' => 'published',
        ]);

        Log::info("Makale başarıyla yayınlandı.");

        return response()->json(['message' => 'Makale başarıyla yayınlandı.']);
    }
}

My service file:

<?php

namespace App\Services;

use GuzzleHttp\Client;
use Illuminate\Support\Facades\Log;
use Symfony\Component\HttpFoundation\StreamedResponse;
use App\Services\OpenAISettingsService;

class OpenAIService
{
    protected $client;
    protected $settingsService;
    protected $settings;

    public function __construct(OpenAISettingsService $settingsService)
    {
        $this->client = new Client();
        $this->settingsService = $settingsService;
        $this->settings = $this->settingsService->getSettings();
    }

    public function getSettings()
    {
        return $this->settings;
    }

    public function generateStreamedText($prompt, $maxTokens)
    {
        if (!$this->settings->api_key) {
            Log::error("OpenAI API anahtarı eksik!");
            return response()->json(['error' => 'OpenAI API anahtarı eksik!'], 400);
        }

        try {
            $jsonData = [
                'model'  => $this->settings->default_model,
                'messages' => [
                    ['role' => 'system', 'content' => 'Sen bir blog yazarı asistansın.'],
                    ['role' => 'user', 'content' => $prompt],
                ],
                'max_tokens' => (int) $maxTokens,
                'temperature' => 0.7,
                'stream' => true,
            ];

            $response = $this->client->post('https://api.openai.com/v1/chat/completions', [
                'headers' => [
                    'Authorization' => "Bearer " . $this->settings->api_key,
                    'Content-Type'  => 'application/json',
                ],
                'json' => $jsonData,
                'stream' => true,
            ]);

            return response()->stream(function () use ($response) {
                header('Content-Type: text/event-stream');
                header('Cache-Control: no-cache');
                header('Connection: keep-alive');
                header('X-Accel-Buffering: no');

                $buffer = "";
                $body = $response->getBody();
                while (!$body->eof()) {
                    $chunk = trim($body->read(4096));

                    if (!empty($chunk)) {
                        $lines = explode("\n", $chunk);
                        foreach ($lines as $line) {
                            if (strpos($line, "data: ") === 0) {
                                $json = json_decode(substr($line, 5), true);
                                if (isset($json['choices'][0]['delta']['content'])) {
                                    $text = trim($json['choices'][0]['delta']['content']);

                                    $buffer .= $text;

                                    if (preg_match('/\s$/', $text) || mb_strlen($buffer) > 6) {
                                        echo "event: update\n";
                                        echo "data: " . $buffer . "\n\n";
                                        ob_flush();
                                        flush();
                                        $buffer = "";
                                    }
                                }
                            }
                        }
                    }
                }

                echo "event: updaten";
                echo "data: <END_STREAMING_SSE>nn";
                ob_flush();
                flush();
            }, 200, [
                'Content-Type' => 'text/event-stream',
                'Cache-Control' => 'no-cache',
                'Connection' => 'keep-alive',
                'X-Accel-Buffering' => 'no',
            ]);
        } catch (\Exception $e) {
            Log::error("OpenAI API Hatası: " . $e->getMessage());
            return response()->json(['error' => 'API isteğinde hata oluştu.'], 500);
        }
    }
}
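
For reference, here is a buffering variant of the read loop I’m experimenting with (a sketch only; it assumes Guzzle’s read() can split an SSE line across two chunks, so a line is only processed once a full newline has arrived, and the delta is forwarded without trim()):

$pending = '';
while (!$body->eof()) {
    $pending .= $body->read(4096);

    // Only process complete lines; keep the trailing partial line in $pending.
    while (($pos = strpos($pending, "\n")) !== false) {
        $line    = substr($pending, 0, $pos);
        $pending = substr($pending, $pos + 1);

        if (str_starts_with($line, 'data: ')) {
            $json  = json_decode(substr($line, 6), true);
            $delta = $json['choices'][0]['delta']['content'] ?? null;

            if ($delta !== null) {
                // No trim(): the spaces around a token are part of the text.
                echo "event: update\n";
                echo 'data: ' . $delta . "\n\n";
                ob_flush();
                flush();
            }
        }
    }
}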

My JS code (included in the Blade template):

<script>
    document.addEventListener("DOMContentLoaded", function () {
        console.log("JavaScript yüklendi ve çalışıyor!");

        document.getElementById('generate-blog').addEventListener('click', function() {
            console.log("Blog oluşturma butonuna basıldı!");

            let formData = new FormData(document.getElementById('ai-blog-form'));
            let outputDiv = document.getElementById('streamedData');
            outputDiv.innerHTML = ""; // Önceki içeriği temizle

            fetch("{{ route('dashboard.blog.ai.generate') }}", {
                method: "POST",
                body: formData,
                headers: {
                    "X-CSRF-TOKEN": document.querySelector('meta[name="csrf-token"]').getAttribute("content")
                }
            })
            .then(response => {
                console.log("Fetch response geldi:", response);
                if (!response.ok) {
                    throw new Error(`HTTP error! Status: ${response.status}`);
                }
                return response.body.getReader();
            })
            .then(reader => {
                let decoder = new TextDecoder();

                function readStream() {
                    reader.read().then(({ done, value }) => {
                        if (done) {
                            console.log("Stream tamamlandı.");
                            return;
                        }

                        let text = decoder.decode(value, { stream: true });
                        console.log("Gelen veri:", text);

                        let lines = text.split("\n");

                        lines.forEach(line => {
                            if (line.startsWith("data: ")) {
                                try {
                                    let kelime = line.replace("data: ", "").trim();
                                    // We append a space after each chunk when writing it
                                    outputDiv.innerHTML += kelime + " ";
                                    console.log("Ekrana yazılan kelime:", kelime);
                                } catch (e) {
                                    console.error("JSON parse hatası:", e, line);
                                }
                            }
                        });

                        readStream();
                    }).catch(error => console.error("Stream okuma hatası:", error));
                }

                readStream();
            })
            .catch(error => console.error("Fetch hatası:", error));
        });
    });
</script>

Routes:

Route::prefix('blog/ai')->group(function () {
        Route::get('/', [AIWriterController::class, 'index'])->name('dashboard.blog.ai.index');
        Route::post('/generate', [AIWriterController::class, 'generate'])->name('dashboard.blog.ai.generate');
        Route::post('/save', [AIWriterController::class, 'save'])->name('dashboard.blog.ai.save');
        Route::post('/publish', [AIWriterController::class, 'publish'])->name('dashboard.blog.ai.publish');
    });

wkhtmltopdf options : How to handle Non-English file name across different OS systems

I’m encountering issues with wkhtmltopdf when passing non-English file names as arguments. For example, when passing a file named Japan日本.xml, the output file is saved as Japan.xml, with the non-English characters missing, instead of Japan日本.xml. This problem occurs on my Ubuntu server, while my client PC runs Windows. I’ve ensured that the necessary fonts are installed on the server and that the locale settings are configured to support UTF-8. However, the issue persists.
I also found a similar issue reported in the wkhtmltopdf repository.

PHP Code Snippet:

<?php
$WKHTMLTOPDF = $wkhtmltopdf_path;
$_options = [
    '--dump-outline ' . escapeshellarg('/usr/local/apache2/htdocs/Japan日本.xml'),
    $input_file_path,
    $output_file_path
];

$options_string = implode(' ', $_options);
$output = shell_exec('"' . $WKHTMLTOPDF . '" ' . $options_string);
?>

Also, note how I pass the option parameters: sometimes we forget to add a space between a command option and its value, which causes issues. It would be helpful if you could explain how to keep proper spacing between options and their values.
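
For the spacing part, this is how I currently understand it should look (a sketch; $wkhtmltopdf_path, $input_file_path and $output_file_path are the same variables as above): each option name and its value are joined with an explicit space, every value goes through escapeshellarg(), and the child process is forced onto a UTF-8 locale.

// Make sure the child process runs under a UTF-8 locale.
putenv('LANG=en_US.UTF-8');
putenv('LC_ALL=en_US.UTF-8');

$_options = [
    '--dump-outline ' . escapeshellarg('/usr/local/apache2/htdocs/Japan日本.xml'),
    escapeshellarg($input_file_path),
    escapeshellarg($output_file_path),
];

$cmd = escapeshellarg($wkhtmltopdf_path) . ' ' . implode(' ', $_options);
$output = shell_exec($cmd . ' 2>&1');   // capture any warnings about the file name too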

How to correctly structure a request for PhonePe Auto-Debit API (/v3/recurring/debit/init)?

I am integrating PhonePe’s Auto-Debit API (/v3/recurring/debit/init) but keep receiving the following error response:

{ 
  "status": "error", 
  "message": "Please check the inputs you have provided. [message = Incorrect Request.]" 
}

Payload Request is:

{
  "merchantId": "MID12345",
  "merchantUserId": "U123456789",
  "subscriptionId": "OMS2006110139450123456789",
  "transactionId": "TX1234567890",
  "autoDebit": true,
  "amount": 39900
}

Is the payload format or the X-VERIFY header wrong?
I have followed the documentation and ensured the request format is correct, but I still get this error. I need clarification on whether my request payload is missing any required fields or whether the X-VERIFY signature is incorrect.

How should I structure the request properly to avoid this error? Any guidance would be appreciated.

  • Ensured transactionId is unique for every request.
  • Verified amount is in paisa format (e.g., ₹399.00 = 39900).
  • Tested in Postman using hardcoded values.
  • Checked subscriptionId format to ensure correctness.
  • Generated X-VERIFY hash according to PhonePe’s documentation.
  • Logged request payload to confirm correct structure.

Expected Outcome: I expected a successful API response with transaction details.

Here is the minimal reproducible example for my issue:

$apiKey = 'your_salt_key';
$baseUrl = 'https://mercury-t2.phonepe.com';
$subscriptionId = 'SUB123456';
$transactionId = 'TX' . strtoupper(bin2hex(random_bytes(8))); // Generate unique transaction ID
$amount = 10000; // Amount in paise (₹100)
$callurl = "https://yourdomain.com/api/phonepe/callback"

// Request payload
$authPayload_1 = [
  'merchantId' => 'your_merchant_id',
  'merchantUserId' => 'your_merchant_id',
  'subscriptionId' => $subscriptionId,  
  'transactionId' => $transactionId,
  'autoDebit' => true,
  'amount' => $amount,
];
$base64Payload = base64_encode(json_encode($authPayload_1));
$payloadHash_1 = $base64Payload . '/v3/recurring/debit/init' . $apiKey;
$checksum_a = hash('sha256', $payloadHash_1);
$xVerifyy = $checksum_a . '###' . 'your_salt_index';

$curl = curl_init();
curl_setopt_array($curl, [
  CURLOPT_URL => $baseUrl . '/v3/recurring/debit/init',
  CURLOPT_RETURNTRANSFER => true,
  CURLOPT_POST => true,
  CURLOPT_POSTFIELDS => json_encode(['request' => $base64Payload]),
  CURLOPT_HTTPHEADER => [
    'Content-Type: application/json',
    'X-VERIFY: ' . $xVerifyy,
    'X-CALLBACK-URL: '.$callurl

  ],
]);

$response = curl_exec($curl);
$httpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
curl_close($curl);

$responseData = json_decode($response, true);
var_dump($responseData);

Keycloak logout

I am new to Keycloak, but I have set up a Keycloak server: the installation, the configuration, and access from my web application.

Authentication goes through a custom SPI for read-only access to a database managed by our web application.
No proxy, just a web app using ExtJS and PHP on Apache2 with mod_auth_openidc.

The problem, something I didn’t think would be this complicated, is understanding what a logout is and how to do it with Keycloak.

Although I’ve already read some of the documentation and examples on the subject, a lot of this is new to me; I feel like I’ve tried everything and I still don’t understand the expected behavior.

I was deleting/clearing cookies, but that does not seem to work on the Keycloak cookies (I tried both in JS and in PHP):

document.cookie.split(";").forEach(cookie => {
    document.cookie = cookie.replace(/^ +/, "")
    .replace(/=.*/, 
            "=;expires=Thu, 01 Jan 1970 00:00:00 UTC;path=/");
});
foreach($_COOKIE as $name => $value) {
        setcookie($name, '', time()-3600, '/');
}
header("Location: /");

I’m now trying to work with the logout URI:
https://1.1.1.1/realms/dev/protocol/openid-connect/logout

I saw some examples using what seems to me to be a deprecated path, /auth/.

Whether I call it via POST or GET, manually through my browser or in my PHP, with or without parameters, it does not log the user out.

Here is the PHP code I am using to try the logout part:


$realm = "dev";
$keycloak_base_url = "https://1.1.1.1:8443";
$redirect_uri = "https://1.1.1.1/app/";
$client_id = "mySuperclient";
$client_secret = "mySuperSecret";


$refresh_token = $_SERVER['OIDC_refresh_token'];
$access_token = $_SERVER['OIDC_access_token'];
$logout_url = "$keycloak_base_url/realms/$realm/protocol/openid-connect/logout";

$data = [
    'client_id' => $client_id,
    'refresh_token' => $refresh_token,
];

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $logout_url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);

$response = curl_exec($ch);
$http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
$curl_error = curl_error($ch);
curl_close($ch);

if ($http_code === 204) {
    echo "Successful! Redirecting...";
    header("Location: /app/");  
    exit();
} else {
    echo "Failed! HTTP Code: $http_code <br>";
    echo "<pre>" . htmlspecialchars($response) . "</pre><br>";
}

OS: Debian 12.9
PHP version: 8.2.26
Keycloak: 26.1.0 (bare-metal install)
mod_auth_openidc: 2.4.12.3

I don’t really know what else to check; everything seems to be good (the client, scopes and redirect URIs work), all that remains is the logout part…

EDIT1: The use of

header ("Location: $logout_url");

redirects me to a Keycloak logout confirmation page. Even after confirming it manually, I can still access my web app without logging in again.

I think I have to add some data to that logout redirect, but I can’t figure out what is necessary.
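
What I plan to try next is doing the whole logout in the browser (a redirect) rather than a server-side curl call, since it’s the browser that holds the Keycloak SSO cookie, and also letting mod_auth_openidc clear its own session. A sketch of both ideas; the OIDC_id_token variable and the /protected/redirect_uri path are assumptions on my side:

$realm             = "dev";
$keycloak_base_url = "https://1.1.1.1:8443";
$post_logout_uri   = "https://1.1.1.1/app/";

// Option A: send the browser to the end-session endpoint with an id_token_hint,
// so Keycloak can skip the confirmation page and honour post_logout_redirect_uri
// (which must be whitelisted on the client).
$logout_url = "$keycloak_base_url/realms/$realm/protocol/openid-connect/logout"
    . '?id_token_hint=' . rawurlencode($_SERVER['OIDC_id_token'] ?? '')
    . '&post_logout_redirect_uri=' . rawurlencode($post_logout_uri);

// Option B: let mod_auth_openidc drive the flow and wipe its own session cookie
// by hitting "<OIDCRedirectURI>?logout=<post-logout URL>" instead.
// $logout_url = '/protected/redirect_uri?logout=' . rawurlencode($post_logout_uri);

header("Location: $logout_url");
exit();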

Markdown snippets in HydePHP

I’d like to do something like this in my Blade pages:

<div class="accordion-content">
    @mdsnippet('snippets/common/some-content-for-multiple-pages.md')
</div>

and then _snippets/common/some-content-for-multiple-pages.md contains:

## here's some content

which Hyde writes out to the output directory as:

<div class="accordion-content">
    <h2>here's some content</h2>
</div>

Is this possible? It looks like we can add Blade in Markdown files:

https://hydephp.com/docs/1.x/advanced-markdown

But I want to do the reverse: Markdown in Blade files.
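
In case it clarifies what I mean, this is the kind of thing I could settle for writing by hand in the Blade page (a sketch only; it assumes Laravel’s Str::markdown() CommonMark helper is available in a Hyde app and that base_path() points at the project root; the @mdsnippet directive above is something I would still have to register myself):

<div class="accordion-content">
    {{-- Render the Markdown partial inline. --}}
    {!! \Illuminate\Support\Str::markdown(
        file_get_contents(base_path('_snippets/common/some-content-for-multiple-pages.md'))
    ) !!}
</div>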

Does upgrading jQuery cause problems with cookies and sessions? [closed]

I have a website with PHP 8.2 that currently uses jQuery 3.3.1. It sets the session cookie with SameSite => None and Secure => true via session_set_cookie_params(), and the website has SSL. session.cookie_lifetime is set to 2 days.

The website uses a payment gateway, and SameSite=None is required to continue the session when the user returns from the payment gateway. Everything is working fine on the website with jQuery 3.3.1, but that version has some vulnerabilities, hence I changed to jQuery 3.7.1.

Everything on the website is fine with jQuery 3.7.1, but the session ends when the user returns from the payment gateway. I tried clearing browser data after the jQuery update and reloading the website, but the problem continues. The website is still using SameSite => None, Secure => true. These are the questions I have (the cookie setup itself is sketched after the list):

  1. Is jQuery 3.7.1 responsible for ending the user session when returning from third-party sites?
  2. Does the SameSite cookie need to be set differently when using jQuery 3.7.1?
  3. Do I have to wait two days after the jQuery upgrade (since session.cookie_lifetime is 2 days) to confirm this session-ending problem?
  4. Is it possible to keep using jQuery 3.3.1 but avoid its vulnerability?
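
For reference, the cookie setup described above looks roughly like this (a sketch of the parameters I mentioned; the lifetime matches session.cookie_lifetime, the remaining flags are assumptions):

session_set_cookie_params([
    'lifetime' => 60 * 60 * 24 * 2,   // 2 days
    'path'     => '/',
    'secure'   => true,               // required when SameSite=None
    'httponly' => true,
    'samesite' => 'None',
]);
session_start();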

WooCommerce: attribute filter on a variable product that shows its variations in the product loop as simple products

I have set up a variable product in the WooCommerce backend. The image below shows one of the product variations.

[screenshot: one of the product variations]

As you can see, coming up with a single variation requires 2 attributes: Colour (white-cream-neautral and brown-gold-orange-yellow) and Colourways (whitecreamneautural1, whitecreamneautural2, whitecreamneautural3, whitecreamneautural4, whitecreamneautural5, BrownGoldOrangeYellow1). By the way, there are more than 8 attributes on the product, but only these 2 are used for the variations.

I want to create a filter based on the attributes, so that when a user clicks one of the attributes on the shop page it shows the parent/variable product. However, when the user clicks one of the Colour attributes, I want it to show the variation products, not the parent/variable product. In the product example above, if I click the white-cream-neautral filter button it should show 5 products, each with its own link, images, prices, etc.; if I click the brown-gold-orange-yellow filter button it should show 1 product, just as I set it up in the backend. Is that possible in WooCommerce?

This is a website that has such a filter: https://instyle.com.au/products/. Try clicking the colour filter; it breaks the variable product into its variations.
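
To make the question more concrete, this is the direction I’ve been sketching for the filter itself (assumptions: the colour attribute taxonomy is pa_colour and the chosen term arrives as a filter_colour query parameter, WooCommerce’s layered-nav convention). This only filters which parent products appear; splitting a matched parent into its individual variations in the loop would still need extra work on top of it:

add_action( 'woocommerce_product_query', function ( $q ) {
    $colour = isset( $_GET['filter_colour'] ) ? sanitize_title( wp_unslash( $_GET['filter_colour'] ) ) : '';
    if ( '' === $colour ) {
        return;
    }

    // Append a tax_query clause for the selected colour term.
    $tax_query   = (array) $q->get( 'tax_query' );
    $tax_query[] = [
        'taxonomy' => 'pa_colour',
        'field'    => 'slug',
        'terms'    => [ $colour ],
    ];
    $q->set( 'tax_query', $tax_query );
} );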

phpseclib rijndael ECB 256 bit key problem moving from PHP5 to PHP8

I have a legacy app that was moved to Debian 11 and is stuck with PHP 8.2 (PHP 5.6 is no longer available for use). The original application used the first version of phpseclib, but after being moved to the new PHP 8.2 machine it no longer works: there is nothing in the log, it just returns unrecognizable data from the encryption calls.

This works on the old server it was moved from (works on PHP5, but not PHP8):

$rijndael = new Crypt_Rijndael(CRYPT_RIJNDAEL_MODE_ECB);

$rijndael->setKey('akeyof32btyeslongabcdefghijklmop');
$keylen = 256;
$rijndael->setKeyLength($keylen);
$rijndael->setBlockLength($keylen);

$decrypted1 = $rijndael->decrypt($EncryptedDataOf256bytes);

I tried upgrading to phpseclib 3 with PHP 8.2; the data returned is also unrecognizable.

$rijndael1 = new \phpseclib3\Crypt\Rijndael('ecb');
$keylen = 256;
$rijndael->setKey('akeyof32btyeslongabcdefghijklmop');
$rijndael1->setKeyLength($keylen);
$rijndael1->setBlockLength($keylen);
$rijndael1->disablePadding();         // tried with and without padding, 

$decrypted1 = $rijndael1->decrypt($bindata);

Looking into phpseclib 3, it appears to support Rijndael ECB with 256-bit keys, but the earlier phpseclib version didn’t seem to, so I’m not even sure at this point how it worked.

I’ve tried everything and need advice on where to look next.
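
For reference, this is the phpseclib 3 setup I think matches the old parameters (a sketch; note that in my snippet above the key is set on $rijndael while $rijndael1 does the decrypting, so $rijndael1 never gets a key):

use phpseclib3\Crypt\Rijndael;

$rijndael = new Rijndael('ecb');
$rijndael->setBlockLength(256);                        // Rijndael's 256-bit block, not AES's fixed 128
$rijndael->setKeyLength(256);
$rijndael->setKey('akeyof32btyeslongabcdefghijklmop'); // 32 bytes = 256-bit key
$rijndael->disablePadding();                           // tried with and without, as above

$decrypted = $rijndael->decrypt($EncryptedDataOf256bytes);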

Points allocation based on Time – Laravel [closed]

I am working on a sports carnival application to remove the need for complex Excel workbooks, as the new person in charge of these events has few Excel skills.

Adding a result record for a heat into the DB is not an issue, but I need to be able to look up and allocate points based on each competitor’s time.

The database would look as such:

Result Table
|ID|Name|Squadron_id|event|competition_id|heat|time|heat_position|overall_position|points|

Points table
|ID|position|points_allocated|
 1      1st         10
 2      2nd         8
 3      3rd         6
 4      4th         4
 5      5th         2
 6      6th         1

What I have no idea how to approach is allocating the overall position and the points in the result table.
These will need to be updated as additional results come in for that event.

Also, is there a way I can add multiple records at once rather than one at a time?
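
What I’m picturing is something along these lines after each new result is saved (a sketch; the results/points table names and the recalculateEventStandings() helper are placeholders, and I’m assuming the points rows can be matched by their ID, i.e. ID 1 = 1st place):

use Illuminate\Support\Facades\DB;

function recalculateEventStandings(int $competitionId, string $event): void
{
    // position => points_allocated, e.g. [1 => 10, 2 => 8, 3 => 6, ...]
    $pointsByPosition = DB::table('points')->orderBy('id')->pluck('points_allocated', 'id')->all();

    $results = DB::table('results')
        ->where('competition_id', $competitionId)
        ->where('event', $event)
        ->orderBy('time')                              // fastest time first
        ->get();

    foreach ($results as $index => $result) {
        $position = $index + 1;

        DB::table('results')->where('id', $result->id)->update([
            'overall_position' => $position,
            'points'           => $pointsByPosition[$position] ?? 0,   // no points past 6th
        ]);
    }
}

For adding several records at once, DB::table('results')->insert($rows) (or Eloquent’s Model::insert()) accepts an array of row arrays, so a whole heat could be written in one call.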