Is there a way to allow scripts from outside sources when making a Chrome extension?

I’m making a Chrome extension that relies heavily on outside scripts and APIs. I’m trying to use the content_security_policy rule in the manifest file to allow scripts from sources like jsdelivr and such.

This is my manifest file, where the scripts are being allowed. I also need to be able to execute inline code.

{
    "manifest_version": 3,
    "name": "Solvem Probler",
    "description": "It prolbs all your solvems.",
    "version": "1.0",
    "content_security_policy": [
        "script-src 'self' "URL HERE"; object-src 'self'",
    ],
    "action": {
      "default_popup": "index.html",
      "default_icon": "spicon.png"
    },
    "permissions" : [
        "activeTab",
        "scripting"
    
    ]
}

This is my HTML, where I link those scripts:

<!DOCTYPE html>
<html>
    <head>
        <meta name=viewport content="width=device-width,initial-scale=1">
        <meta charset="utf-8"/>
        <script src="URL HERE"></script>
    </head>

Is there anything I’m doing wrong?

P.S. I had to trim the URLs down greatly because Stack Overflow was flagging this as spam.

EDIT: When debugging the unpacked extension, it returns the following error: Invalid value for 'content_security_policy'. Could not load manifest.
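That manifest error is consistent with Manifest V3's schema: content_security_policy is an object keyed by context, not an array of strings, and the nested double quotes around URL HERE also break the JSON itself. Note too that MV3 disallows remote hosts in the extension_pages script-src, so CDN scripts can't be whitelisted there. A minimal well-formed shape for comparison:

```json
{
  "content_security_policy": {
    "extension_pages": "script-src 'self'; object-src 'self'"
  }
}
```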

providesTags of Redux Toolkit Query returns an error on the first request, while the succeeding requests say success

I am trying to implement pagination for the test results page.

undefined testResultsApiSlice.jsx:33
testResultsList testResultsApiSlice.jsx:34
GET http://localhost:3500/testResults? 400 (Bad Request) apiSlice.jsx:19
{ids: Array(3), entities: {…}, totalPages: 2, currentUser: {…}} testResultsApiSlice.jsx:33 (2)
[‘testResultsList’, {…}] testResultsApiSlice.jsx:34 (2)

Above is the result of console logging result (testResultsApiSlice.jsx:33) and arg (testResultsApiSlice.jsx:34) in providesTags of getTestResults.
When I navigate to the TestResultsList page, the first request returns result as undefined and arg as just testResultsList (ChatGPT says the params, which are currentPage, limit, and username, were undefined when this request was made, which is why this happens). After that failed request, it logs GET http://localhost:3500/testResults? 400 (Bad Request) apiSlice.jsx:19.
Then, a second later, it refetches and logs this:

{ids: Array(3), entities: {…}, totalPages: 2, currentUser: {…}} testResultsApiSlice.jsx:33 (2)
[‘testResultsList’, {…}] testResultsApiSlice.jsx:34 (2)

which makes it seem like everything is fine. However, no items are shown in the list, and when I check the Redux DevTools, there is no data in testResultsList and it shows an error message saying ‘Username not found’, which I think came from the first failed request. Whenever there is a refetchOnFocus, the Redux DevTools keep showing the ‘Username not found’ error even though the console logs this:

{ids: Array(3), entities: {…}, totalPages: 2, currentUser: {…}} testResultsApiSlice.jsx:33 (2)
[‘testResultsList’, {…}] testResultsApiSlice.jsx:34 (2)

The process above also repeats whenever I change the page in the pagination.

TestResultsList.jsx

const TestResultsList = () => {
  const { username } = useAuth();

  const [selectedTestResultId, setSelectedTestResultId] = useState(null);
  const [isEditing, setIsEditing] = useState(false);
  const [search, setSearch] = useState('');
  const [currentPage, setCurrentPage] = useState(1);
  const itemsPerPage = 10;

  const {
    data: testResults,
    isLoading,
    isSuccess,
    isError,
    error,
  } = useGetTestResultsQuery(
    ['testResultsList', { page: currentPage, limit: itemsPerPage, username }],
    {
      pollingInterval: 60000,
      refetchOnFocus: true,
      refetchOnMountOrArgChange: true,
    }
  );
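Since the first request fires before useAuth has produced username, one possible guard (a sketch using RTK Query's skip option; everything else is unchanged from the hook call above) is to hold the query until the arg is complete:

```jsx
const {
  data: testResults,
  isLoading,
  isSuccess,
  isError,
  error,
} = useGetTestResultsQuery(
  ['testResultsList', { page: currentPage, limit: itemsPerPage, username }],
  {
    skip: !username, // don't fire until useAuth has resolved the username
    pollingInterval: 60000,
    refetchOnFocus: true,
    refetchOnMountOrArgChange: true,
  }
);
```

If the query never fires with an undefined username, the 400 and the cached 'Username not found' error should disappear.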

apiSlice for testResults

import { createEntityAdapter, createSelector } from '@reduxjs/toolkit';
import { apiSlice } from '../../app/api/apiSlice';

const testResultsAdapter = createEntityAdapter({});

const initialState = testResultsAdapter.getInitialState();

export const testResultsApiSlice = apiSlice.injectEndpoints({
  endpoints: (builder) => ({
    getTestResults: builder.query({
      query: ([, { page, limit, username }]) => ({
        url: `/testResults`,
        params: { page, limit, username },
        validateStatus: (response, result) => {
          return response.status === 200 && !result.isError;
        },
      }),
      transformResponse: (responseData) => {
        const loadedTestResults = responseData.testResults.map((testResult) => {
          testResult.id = testResult._id;
          return testResult;
        });
        return {
          ...testResultsAdapter.setAll(initialState, loadedTestResults),
          totalPages: responseData.totalPages,
          currentUser: responseData.currentUser,
        };
      },
      providesTags: (result, error, arg) => {
        console.log(result);
        console.log(arg);

        if (result?.ids) {
          return [
            { type: 'TestResult', id: 'LIST' },
            ...result.ids.map((id) => ({ type: 'TestResult', id })),
          ];
        } else return [{ type: 'TestResult', id: 'LIST' }];
      },
    }),
    addNewTestResult: builder.mutation({
      query: (formData) => ({
        url: '/testResults',
        method: 'POST',
        body: formData,
      }),
      invalidatesTags: [{ type: 'TestResult', id: 'LIST' }],
    }),
    updateTestResult: builder.mutation({
      query: (formData) => ({
        url: '/testResults',
        method: 'PATCH',
        body: formData,
      }),
      invalidatesTags: (result, error, arg) => [
        { type: 'TestResult', id: arg.id },
      ],
    }),
    deleteTestResult: builder.mutation({
      query: ({ id }) => ({
        url: '/testResults',
        method: 'DELETE',
        body: { id },
      }),
      invalidatesTags: (result, error, arg) => [
        { type: 'TestResult', id: arg.id },
      ],
    }),
  }),
});

export const {
  useGetTestResultsQuery,
  useAddNewTestResultMutation,
  useUpdateTestResultMutation,
  useDeleteTestResultMutation,
} = testResultsApiSlice;

export const selectTestResultsResult =
  testResultsApiSlice.endpoints.getTestResults.select();

const selectTestResultsData = createSelector(
  selectTestResultsResult,
  (testResultsResult) => testResultsResult.data
);

export const {
  selectAll: selectAllTestResults,
  selectById: selectTestResultById,
  selectIds: selectTestResultIds,
} = testResultsAdapter.getSelectors(
  (state) => selectTestResultsData(state) ?? initialState
);
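The providesTags branching above can be exercised in isolation. Extracted as a plain function (same logic, just renamed for the sketch), the two cases are: with ids, a LIST tag plus one tag per entity; without a result (e.g. after the failed first request), only the LIST tag:

```javascript
// Same branching as providesTags in the slice above, as a pure function
function tagsFor(result) {
  if (result?.ids) {
    return [
      { type: 'TestResult', id: 'LIST' },
      ...result.ids.map((id) => ({ type: 'TestResult', id })),
    ];
  }
  return [{ type: 'TestResult', id: 'LIST' }];
}

console.log(tagsFor(undefined));
console.log(tagsFor({ ids: ['a', 'b'] }));
```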

backend controller for getting all test results

const getAllTestResults = async (req,res) => {
  const { page, limit, username } = req.query;
  const skip = (page - 1) * limit;

  const currentUser = await User.findOne({ username })
    .select('-password')
    .lean()
    .exec();

  if (!currentUser) {
    return res.status(400).json({ message: 'Current user not found' });
  }

  const totalTestResults = await TestResult.countDocuments({ user: currentUser._id });
  const totalPages = Math.ceil(totalTestResults / limit);

  const testResults = await TestResult.find({ user: currentUser._id })
    .skip(skip)
    .limit(limit)
    .lean();

  res.status(200).json({
    totalPages,
    testResults,
    currentUser
  });
}
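One subtlety in the controller: page and limit arrive from req.query as strings, so the skip and totalPages arithmetic relies on implicit coercion. A small sketch of the same math with explicit conversion (function name is illustrative):

```javascript
// Explicit Number() conversion of query-string params before the math
function paginate(page, limit, total) {
  const p = Number(page);
  const l = Number(limit);
  return { skip: (p - 1) * l, totalPages: Math.ceil(total / l) };
}

console.log(paginate('2', '10', 15)); // → { skip: 10, totalPages: 2 }
```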

How to make a footer stick to the bottom of a section?

I am trying to make my footer stick to the bottom of a section. I am using position: fixed, but it is still not working.

In other words, I have a section with a width of 30%. I want to add form elements to that section, and I also need a sticky button at the bottom of it that covers 100% of the section's width (i.e., 30% of the page width).

Is that feasible?


body {
  height: 100vh;
}

.fixed-footer {
  position: fixed;
  bottom: 0;
  left: 0;
  width: 100%;
  z-index: 1300;
  background: #ffffff;
  padding: 16px 42px;
  box-shadow: 0px -3px 4px rgba(0, 0, 0, 0.25);
  border-top: 1px solid #bebec1;
}
<div style="width:30%;height:100%;border:1px solid">
  <div class="MuiBox-root css-11b71wf" data-testid="create-contract-content">
    <div class="wrapper-container css-1rnkd8n MuiBox-root css-0">
      <div class="title-container css-1112he MuiBox-root css-0">
        <h2 class="MuiTypography-root MuiTypography-body1 css-1ecwa68-MuiTypography-root">Add New Contract</h2>
      </div>
      <form novalidate="">
        <div class="MuiGrid-root MuiGrid-container box css-crtrjo-MuiGrid-root">
        </div>
        <div class="fixed-footer css-imiwo4 MuiBox-root css-0">
          <div class="css-gg4vpm MuiBox-root css-0">
            <div>
              <button class="MuiButtonBase-root MuiButton-root MuiButton-text MuiButton-textPrimary MuiButton-sizeMedium MuiButton-textSizeMedium MuiButton-root MuiButton-text MuiButton-textPrimary MuiButton-sizeMedium MuiButton-textSizeMedium base-btn ,undefined css-wpz0ho-MuiButtonBase-root-MuiButton-root" tabindex="0" type="button" dpw-variant="secondary">CANCEL<span class="MuiTouchRipple-root css-8je8zh-MuiTouchRipple-root"></span></button>
              <button class="MuiButtonBase-root MuiButton-root MuiButton-text MuiButton-textPrimary MuiButton-sizeMedium MuiButton-textSizeMedium MuiButton-root MuiButton-text MuiButton-textPrimary MuiButton-sizeMedium MuiButton-textSizeMedium base-btn ,undefined css-14tsggr-MuiButtonBase-root-MuiButton-root" tabindex="0" type="submit" dpw-variant="primary">CREATE<span class="MuiTouchRipple-root css-8je8zh-MuiTouchRipple-root"></span></button>
            </div>
          </div>
        </div>
      </form>
    </div>
  </div>
</div>

Update: if I go with position: relative on the container and position: absolute on the button, the button moves up when I scroll.
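One direction worth trying (a sketch, not a drop-in fix; the .section class name here is generic): make the 30% section the element that scrolls, and let the footer be sticky inside it. position: sticky pins the button to the bottom of the scrollable section while it still spans only the section's width:

```css
.section {
  width: 30%;
  height: 100vh;
  overflow-y: auto;     /* the section itself scrolls */
  border: 1px solid;
}

.fixed-footer {
  position: sticky;     /* sticks within the scrolling section, not the page */
  bottom: 0;
  width: 100%;          /* 100% of the section = 30% of the page */
  background: #ffffff;
  padding: 16px 42px;
  box-shadow: 0px -3px 4px rgba(0, 0, 0, 0.25);
  border-top: 1px solid #bebec1;
}
```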

TypeScript path alias not working with Vite

Hey guys, I created a simple TypeScript project using the Vite CLI. Unfortunately, when doing so, Vite didn’t generate a vite.config file that I could use to add path aliases to my project.

What I want to achieve:

I just want to use "@/" instead of "./src/".

My tsconfig.json:

{
  "compilerOptions": {
    "target": "ES2020",
    "useDefineForClassFields": true,
    "module": "ESNext",
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "skipLibCheck": true,

    /* Bundler mode */
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "moduleDetection": "force",

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true,

    /* Path alias */
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["src", "types/global.d.ts"]
}

Again there is no vite.config as I used this CLI command to generate the project:

bun create vite my-ts-app --template vanilla-ts

I tried adding the path alias option to my tsconfig.json file and restarted both VS Code and the TS Server, but with no success:

    /* Path alias */
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"]
    }
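The tsconfig paths entry only informs the type checker; Vite resolves imports itself, so the alias must also be declared in a vite.config.ts (which can simply be created by hand at the project root). A sketch:

```typescript
// vite.config.ts
import { defineConfig } from 'vite';
import { fileURLToPath, URL } from 'node:url';

export default defineConfig({
  resolve: {
    alias: {
      // mirror the tsconfig "paths" mapping so Vite and tsc agree
      '@': fileURLToPath(new URL('./src', import.meta.url)),
    },
  },
});
```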

In Rails 7, how to render a Chartkick chart with a JavaScript function inside the options?

I want to use Chartkick to visualize data. My basic code looks like this:

<%= line_chart [{ name: "Weather forecast",
                      data: @dataset
               }],
               {
                 defer: true,
               }
%>

In this case, the basic chart renders – all good.

Now, I would like to modify the labels on the y-axis:

<%= line_chart [{ name: "Weather forecast",
                  data: @dataset
               }],
               {
                 defer: true,
                 yaxis: {
                   labels: {
                   formatter: function(val, opts) {
                                return val;
                              }
                   }
                },
        }
%>

And here comes the problem – executing this code results in the following error:

undefined local variable or method `val' for #<ActionView::Base:0x00000000150590>

I also tried the following modification:

<%= line_chart [{ name: "Weather forecast",
                  data: @dataset
               }],
               {
                 defer: true,
                 yaxis: {
                   labels: {
                   formatter: raw("function(val, opts) {
                                return val;
                              }")
                   }
                },
        }
%>

This removed the previous error and the page rendered successfully, BUT without the chart, and the browser console shows the following error:

Uncaught (in promise) TypeError: this.w.config.yaxis[e].labels.formatter is not a function

How do I fix this? The purpose of the code in the formatter block is to format the numbers.
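The underlying problem is that the ERB options are serialized to JSON before they reach the charting adapter, and a Ruby-side function cannot survive that round trip; raw() just injects a string that the library then rejects as "not a function". One workaround (a sketch, assuming the chartkick.js global options hook is available in your JS bundle) is to define the formatter on the JavaScript side before the charts initialize:

```javascript
// app JavaScript: set library-level defaults, where real functions are allowed
Chartkick.options = {
  library: {
    yaxis: {
      labels: {
        formatter: function (val) {
          return val; // format the number here
        },
      },
    },
  },
};
```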

I am using Vue 3 to connect to a Supabase database and pull records into a table. I am having trouble deleting a row using a click event

I am new to Vue 3 and I am following some online Vue 3 training :)
I get records back from the Supabase database, and the table gets populated with the expected data. The last column has an action button with a delete button for every row. The challenge I am having is that when I click the delete button, nothing gets deleted. I tried a few suggestions online but no luck.

Here is my template:

<template>
    <a-table :data-source="posts">
    <a-table-column-group >
      <a-table-column key="name" data-index="fullName">
        <template #title><span style="color: #1890ff">Name</span></template>
      </a-table-column>
      <a-table-column key="amount" title="Amount" data-index="amount" />
      <a-table-column key="recieved_date" title="recieved_date" data-index="recieved_date" />
       <a-table-column key="description" title="Description" data-index="description" /> 
    <a-table-column key="id" title="ID" data-index="id" /> 
    <a-table-column key="action" title="Action">
        <span>      
         <AButton @click="deleteRow(row.id)">Delete</AButton> 
          </span>
         </a-table-column>
   </a-table-column-group>
  </a-table>
</template>
<script setup>
const columns = [{
  name: 'Name',
  dataIndex: 'fullName',
  key: 'fullName',
}, {
  title: 'recieved_date',
  dataIndex: 'recieved_date',
  key: 'recieved_date',
},  {
  title: 'Amount',
  key: 'amount',
  dataIndex: 'amount',
}, 
{
  title: 'Description',
  dataIndex: 'description',
  key: 'description',
},
{
  title: 'id',
  dataIndex: 'id',
  key: 'id',
 
 },
{
  title: 'Action',
  dataIndex: 'action',
  key: 'action',
 
 },
];

  //delete record
    
     const deleteRow = async () => {
    
    const { data,index } =
    await supabase.from("Contractor_Payment")
    .delete()
    .eq("id",row.id)
     
  onMounted(() => {
    fetchData()
  })
</script>
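The template references row, but nothing defines it, so deleteRow(row.id) passes undefined and the .eq("id", ...) filter matches nothing. A sketch of one fix, assuming ant-design-vue's column slot exposes the row as record and the fetched rows live in a posts ref:

```javascript
// Template change: give the action column a scoped slot so the row is in scope
//
//   <a-table-column key="action" title="Action">
//     <template #default="{ record }">
//       <AButton @click="deleteRow(record.id)">Delete</AButton>
//     </template>
//   </a-table-column>
//
// Script side: accept the id, await the delete, then drop the row locally.
function removeById(rows, id) {
  return rows.filter((row) => row.id !== id);
}

// const deleteRow = async (id) => {
//   const { error } = await supabase.from('Contractor_Payment').delete().eq('id', id);
//   if (!error) posts.value = removeById(posts.value, id);
// };

console.log(removeById([{ id: 1 }, { id: 2 }], 1)); // → [ { id: 2 } ]
```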

How can you ensure that you can only enter numbers, and optionally decimals?

I have an Input component in Blazor. I want to allow only digits and, optionally, decimals.

And I get an error: “Unhandled exception rendering component: event.preventDefault is not a function”

  • The component must show numbers with dots in the thousands.

  • And commas with decimals.

  • And a decimal number must be stored.

    @inherits InputNumber<decimal?>
     @using System.Globalization
     @using System.Text.RegularExpressions
    
     <input @attributes="AdditionalAttributes"
            id="@_id"
            class="@CssClass"
            value="@_stringValue"
            @onkeydown="OnKeyDown"
            @oninput="OnInput"
            @onchange="OnValueChanged" />
    
     <script>
         window.validateInput = function (inputId, allowDecimals, event) {
             const key = event.key;
    
             if (
                 (key >= '0' && key <= '9') ||
                 key === 'Backspace' ||
                 key === 'Tab' ||
                 key === 'Enter' ||
                 key === 'ArrowLeft' ||
                 key === 'ArrowRight' ||
                 (allowDecimals && (key === ',' || key === '.'))
             ) {
                 return true;  
             }
    
             event.preventDefault();
    
             return false;
         }
    
         window.formatNumberInput = function (value, allowDecimals) {
             let cleanedValue = value.replace(/[^0-9.,]/g, '');
             let parts = cleanedValue.split(',');
             let integerPart = parts[0].replace(/\B(?=(\d{3})+(?!\d))/g, ".");
             let formattedValue = integerPart;
    
             if (allowDecimals && parts.length > 1) {
                 formattedValue += ',' + parts[1].substring(0, 2);  
             }
    
             return formattedValue;
         }
     </script>
    
     @code {
         private string _id { get; set; } = Guid.NewGuid().ToString();
         private string? _stringValue;
         private string? _currentValue;
    
         [Inject] IJSRuntime _jsRuntime { get; set; }
    
         [Parameter] public string FormatString { get; set; } = "#,##0.00";
         [Parameter] public bool AllowDecimals { get; set; } = true;
    
         protected override void OnParametersSet()
         {
             if (Value.HasValue)
             {
                 _stringValue = Value.Value.ToString(FormatString, CultureInfo.CurrentCulture);
             }
             else
             {
                 _stringValue = null;
             }
         }
    
         private async Task OnKeyDown(KeyboardEventArgs e)
         {
             await _jsRuntime.InvokeVoidAsync("validateInput", _id, AllowDecimals, e);
         }
    
         private async Task OnInput(ChangeEventArgs e)
         {
             _currentValue = e.Value?.ToString();
             var formattedValue = await _jsRuntime.InvokeAsync<string>("formatNumberInput", _currentValue, AllowDecimals);
             _stringValue = formattedValue;
    
             var rawValue = formattedValue.Replace(".", "").Replace(",", ".");
             if (decimal.TryParse(rawValue, out var result))
             {
                 Value = result;
                 await ValueChanged.InvokeAsync(Value);
             }
         }
    
         private async Task OnValueChanged(ChangeEventArgs e)
         {
             await OnInput(e);
         }
     }
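The reported "event.preventDefault is not a function" happens because Blazor's KeyboardEventArgs is a serialized DTO, not the live DOM event, so the JS interop call receives a plain object. One way around it is to keep the filtering entirely in the browser: extract the key check into a predicate and attach it as a real keydown listener (attachNumericFilter is a hypothetical helper you would invoke once, e.g. from OnAfterRenderAsync):

```javascript
// Same allow-list as validateInput above, as a pure predicate
function isAllowedKey(key, allowDecimals) {
  return (
    (key >= '0' && key <= '9') ||
    ['Backspace', 'Tab', 'Enter', 'ArrowLeft', 'ArrowRight'].includes(key) ||
    (allowDecimals && (key === ',' || key === '.'))
  );
}

// Hypothetical wiring: the listener receives the real DOM event, where
// preventDefault actually exists.
// window.attachNumericFilter = (inputId, allowDecimals) => {
//   document.getElementById(inputId).addEventListener('keydown', (e) => {
//     if (!isAllowedKey(e.key, allowDecimals)) e.preventDefault();
//   });
// };

console.log(isAllowedKey('5', false)); // → true
console.log(isAllowedKey(',', false)); // → false
```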
    

K6 shared mutable memory

I’m doing some tests that require me to collect and store individual “processing ids” from calls in K6 to our backend APIs. Those ids are required for some other tests down the line. Our backend system doesn’t allow us to complete the two API requests one after another because of the huge processing time it takes.

So, TL;DR: I fire an API request and get a processing id back. Several hours later I can use that id to download a PDF file.

Now I need to store those ids when running the first test. Is there a simple way in k6 to have some kind of “shared mutable memory”, or are there other solutions that don’t involve writing thousands of individual files to store and afterwards load those ids?
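For context, each k6 VU runs in an isolated JS runtime, and SharedArray is read-only, so there is no built-in shared mutable memory. One low-tech pattern is to emit each id as a structured console line and post-process the run's output into a file for the later test (the endpoint and the processingId response field below are placeholders):

```javascript
import http from 'k6/http';

// First test: fire the request, capture the processing id, and log it as
// JSON. Run with `k6 run --console-output=ids.log script.js`, then strip
// the ids out of ids.log before the follow-up PDF test hours later.
export default function () {
  const res = http.post('https://example.com/api/start-job'); // placeholder URL
  const id = res.json('processingId'); // assumed response field
  console.log(JSON.stringify({ processingId: id }));
}
```

An external store (e.g. pushing each id to a small collector service over HTTP) is another option when post-processing logs is impractical.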

React Query fetches data with 200 response but does not render it correctly

I’m building a React application using React-Query to fetch posts from an API. Although the API returns a 200 response and the data seems to be fetched correctly, I’m having trouble rendering the data in my component.

Code Overview

Here’s a simplified version of my Posts component where I make the API call:

import React, { useEffect } from 'react';
import { useQuery } from "@tanstack/react-query";
import { useNavigate } from 'react-router-dom';

const Posts = ({ feedType, username, userId }) => {
    const navigate = useNavigate();

    const getPostEndpoint = () => {
        switch (feedType) {
            case "forYou":
                return "http://example.com/api/explore/";
            case "following":
                return "http://example.com/api/home/";
            case "posts":
                return `http://example.com/api/profile/${username}/`;
            case "likes":
                return `http://example.com/api/posts/likes/${userId}/`;
            default:
                return "http://example.com/api/home/";
        }
    };

    const { isLoading, data: posts = [], error } = useQuery({
        queryKey: ["posts", feedType, username, userId],
        queryFn: async () => {
            const response = await fetch(getPostEndpoint());
            if (!response.ok) {
                throw new Error('Network response was not ok');
            }
            return response.json();
        },
    });

    useEffect(() => {
        // Fetching logic (if needed)
    }, [feedType]);

    return (
        <>
            {isLoading && <p>Loading...</p>}
            {error && <p>Error fetching posts: {error.message}</p>}
            {posts.length > 0 ? (
                posts.map(post => <Post key={post.id} post={post} />)
            ) : (
                <p>No posts available</p>
            )}
        </>
    );
};

export default Posts;

API response(Curl):

{
  "id": 13,
  "content": "hello",
  "image": null,
  "video": null,
  "voice": null,
  "music": null,
  "creation_date": "2024-09-28",
  "user": 23,
  "post_like_amount": 0,
  "post_comment_amount": 1
}

Issues Encountered

  1. Response Structure: The API returns a 200 response with the expected data structure, but I’m not sure how to verify if it’s the right format before rendering.

  2. Rendering Logic: I’ve set up the component to check for loading and error states, but it seems to not render the posts correctly even when data is fetched successfully.

  3. Data not Rendering: The component always falls into the “No posts available” message, even when I can see in the console that data is fetched successfully. Here’s a console log showing the response:

console.log('Fetched data:', posts);

What I’ve Tried

  1. I ensured the API returns the expected data format.

  2. I checked the network tab to confirm the correct API responses.

  3. I added console logs at various stages to debug, but the data still doesn’t appear.

Additional Information

I suspect there might be issues related to state management with React-Query or how I’m interpreting the fetched data. Any insights or suggestions on how to resolve this would be greatly appreciated!
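Worth noting: the curl output shown is a single object, not an array, so posts.length is undefined and the render always falls through to "No posts available". A defensive normalization sketch (the cleaner fix may be to make the API return a list):

```javascript
// Wrap a single-object payload in an array; pass arrays through untouched
function toPostList(data) {
  if (Array.isArray(data)) return data;
  return data ? [data] : [];
}

console.log(toPostList({ id: 13, content: 'hello' }).length); // → 1
console.log(toPostList([{ id: 1 }, { id: 2 }]).length);       // → 2
```

In the component this could be applied inside queryFn, or via React Query's select option, before the .map call.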

Typing reusable components for object fields in react-hook-form

We’re having issues using TypeScript types with reusable react-hook-form components for object fields with nested inputs. In particular, we have trouble with the Path type.

Eschewing reusable components and writing everything inline works, but is not a practical solution. Here’s an inline example where name is a localized object containing nested fields with translations in languages en and nb:

import { Controller, useForm, type SubmitHandler } from "react-hook-form";

type ExampleModel = {
  name: { nb: string; en: string };
};


const defaultValues = {
  name: { nb: "nb name", en: "en name" },
}

export const ExampleForm = () => {
  const { control, handleSubmit } = useForm<ExampleModel>({ defaultValues });

  const onSubmit: SubmitHandler<ExampleModel> = (data) => console.log(data);

  return (
    <Form onSubmit={handleSubmit(onSubmit)}>
      <Controller control={control} name="name.nb" render={({ field }) => <input {...field} />} />
      <Controller control={control} name="name.en" render={({ field }) => <input {...field} />} />
      <button type="submit">Submit</button>
    </Form>
  );
};

We use the controlled paradigm with Controller in order to integrate with our UI library.

We’d like to define a reusable component LocalizedInput that encapsulates the two inputs. The issue is writing types for the component props so that ${name}.nb and ${name}.en satisfy Path<Model>.

At first glance it looks like FieldPathByValue might do the trick but the error "... is not assignable to type 'Path<Model>'" persists:

import { FieldPathByValue } from 'react-hook-form';

const LocalizedInput = <Model extends FieldValues>({
  control,
  name,
}: {
  control: Control<Model>;
  name: FieldPathByValue<Model, { nb: string; en: string }>;
}) => (
  <>
    <Controller control={control} name={`${name}.nb`} render={({ field }) => <input {...field} />} />
    <Controller control={control} name={`${name}.en`} render={({ field }) => <input {...field} />} />
  </>
);

I find it a bit confusing that this doesn’t work, since the following correctly evaluates to type LocalizedPath = "name":

type LocalizedPath = FieldPathByValue<ExampleModel, { nb: string; en: string }>;

Any idea how to solve this?
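One pragmatic workaround, using only the types react-hook-form already exports: keep FieldPathByValue for the prop, and assert the concatenated paths back to Path<Model>, since the compiler cannot prove that narrowing for an arbitrary Model (a sketch, not the only possible typing):

```tsx
import {
  Controller,
  type Control,
  type FieldValues,
  type FieldPathByValue,
  type Path,
} from "react-hook-form";

const LocalizedInput = <Model extends FieldValues>({
  control,
  name,
}: {
  control: Control<Model>;
  name: FieldPathByValue<Model, { nb: string; en: string }>;
}) => (
  <>
    <Controller
      control={control}
      // cast is safe by construction: name points at an { nb, en } object
      name={`${name}.nb` as Path<Model>}
      render={({ field }) => <input {...field} />}
    />
    <Controller
      control={control}
      name={`${name}.en` as Path<Model>}
      render={({ field }) => <input {...field} />}
    />
  </>
);
```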

Issue with Redirecting After Successful Login and Signup in Next.js + Firebase with JWT Authentication

I’m using Next.js with Firebase Authentication and JWT to build a login/signup flow. Everything seems to be working fine — users can register and login successfully, and JWT tokens are generated without any issues. However, after a successful login or signup, users are not being redirected to the dashboard. Instead, they stay on the same page (either the login or signup page).

No errors related to authentication are showing up in the console, and I can verify that the JWT is being created and returned correctly from the server. The issue seems to be with how cookies are being set or handled.

1. Login Page (login.tsx)

"use client";
import React, { useState } from "react";
import { useRouter } from "next/navigation";
import firebase from "@/firebase/firebaseConfig"; // Firebase configuration
import { signInWithEmailAndPassword } from "firebase/auth";

const Login = () => {
  const [email, setEmail] = useState("");
  const [password, setPassword] = useState("");
  const router = useRouter();  // To programmatically redirect

  const handleLogin = async () => {
    try {
      const userCredential = await signInWithEmailAndPassword(
        firebase.auth(),
        email,
        password
      );

      const token = await userCredential.user.getIdToken();

      // Send token to backend for JWT generation and set cookie
      const response = await fetch("/api/auth/login", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ token }),
      });

      if (response.ok) {
        // Assuming the backend sets the JWT cookie here
        router.push("/dashboard"); // Redirect after successful login
      } else {
        console.log("Failed to login: ", await response.json());
      }
    } catch (error) {
      console.log("Error logging in: ", error);
    }
  };

  return (
    <div>
      <h1>Login</h1>
      <input
        type="email"
        placeholder="Email"
        value={email}
        onChange={(e) => setEmail(e.target.value)}
      />
      <input
        type="password"
        placeholder="Password"
        value={password}
        onChange={(e) => setPassword(e.target.value)}
      />
      <button onClick={handleLogin}>Login</button>
    </div>
  );
};

export default Login;

2. Backend Login API (pages/api/auth/login.ts)

import { NextApiRequest, NextApiResponse } from "next";
import { verifyIdToken } from "@/utils/firebaseAdmin";
import { generateJWTToken } from "@/utils/jwt";
import cookie from "cookie";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method === "POST") {
    const { token } = req.body;

    try {
      // Verify Firebase token
      const decodedToken = await verifyIdToken(token);

      // Generate JWT
      const jwtToken = generateJWTToken({
        _id: decodedToken.uid,
        email: decodedToken.email,
      });

      // Set JWT in cookies
      res.setHeader(
        "Set-Cookie",
        cookie.serialize("token", jwtToken, {
          httpOnly: true,
          secure: process.env.NODE_ENV === "production",
          maxAge: 60 * 60 * 24 * 7, // 1 week
          path: "/",
        })
      );

      res.status(200).json({ message: "Login successful" });
    } catch (error) {
      res.status(500).json({ message: "Failed to authenticate" });
    }
  } else {
    res.status(405).json({ message: "Method not allowed" });
  }
}

3. Dashboard Page (dashboard.tsx)

"use client";
import React, { useEffect } from "react";
import { useRouter } from "next/navigation";
import { cookies } from "next/headers"; // Using next/headers for cookie retrieval

const Dashboard = () => {
  const router = useRouter();

  useEffect(() => {
    const token = cookies().get("token")?.value; // Attempt to read the token from cookies
    if (!token) {
      router.push("/login"); // Redirect to login if token doesn't exist
    }
  }, []);

  return <div>Welcome to the dashboard!</div>;
};

export default Dashboard;

What’s Working:

  • Firebase authentication works perfectly.
  • JWT tokens are generated on the server-side.
  • The API returns a 200 status when login/signup is successful, and the backend sets the cookie.

After successful login/signup:

  • The cookie is not being set on the client-side, which is why the dashboard is not accessible after login.
  • The cookies().get("token") is returning undefined on the dashboard, so the user is being redirected to the login page.
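Two things stand out here: an httpOnly cookie is invisible to client-side JavaScript by design, and cookies() from next/headers is a server-only API, so the "use client" dashboard check can never see the token. A sketch of moving the check into middleware instead (file location and matcher follow Next.js conventions; adjust paths as needed):

```typescript
// middleware.ts (project root) — runs before the matched routes
import { NextResponse, type NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  const token = request.cookies.get('token')?.value;
  if (!token) {
    // No JWT cookie: send the user back to the login page
    return NextResponse.redirect(new URL('/login', request.url));
  }
  return NextResponse.next();
}

export const config = { matcher: ['/dashboard/:path*'] };
```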

ServiceNow: printing lines from an array in a UI page

I have a requirement where I need to dynamically select one option from a select box depending on the information coming from the record.

The best I could come up with was this:

<div class="form-group" style="width:150px;">
  <label class="col-md-4 control-label" for="language">Customer's Language</label>
  <div class="col-md-2" style="width:100px;">
    <select id="language" name="language" class="form-control">

        <g:evaluate var="jvar_ur" object="true">
            var language = 'it';
            var languages = [
            {
                "text"  : "German",
                "value" : "de"
            },
            {
                "text"     : "French",
                "value"    : "fr"
            },
            {
                "text"  : "Italian",
                "value" : "it"
            },
            {
                "text"  : "English",
                "value" : "en"
            }
            ];
            var options = [];

            for (keys in languages){
                if (language == languages[keys].value){
                    options.push('<option value="' + languages[keys].value + '" selected="selected">' + languages[keys].text + '</option>');
                }else{
                    options.push('<option value="' + languages[keys].value + '">' + languages[keys].text + '</option>');
                }
            }
            options;
        </g:evaluate>
        <j2:forEach items="${jvar_ur}" var="jvar_array_item">   
        

            "${jvar_array_item}"

        </j2:forEach> 

(Bear in mind that the value of “language” is hardcoded for testing; this info will be fetched from the URL once this starts to work.)

My problem is that I cannot find a way of “printing” those text/HTML lines inside the array “options”.
I’ve found many different solutions for creating options from an array, but none of them would allow me to have a preselected option.

Can someone help me figure out what is wrong in this code and/or how to make this work?

Thanks in advance
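One direction to try: ${...} output in Jelly is HTML-escaped by default, which is why the <option> strings print as text instead of becoming elements. Assuming the instance supports the g:no_escape tag (an assumption worth verifying against your ServiceNow version), the loop would become:

```xml
<j2:forEach items="${jvar_ur}" var="jvar_array_item">
    <g:no_escape>${jvar_array_item}</g:no_escape>
</j2:forEach>
```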

Uploading chunks to an S3 bucket creates 2 files (one 0B and one with the actual size) and overwrites previous files

  • Hello, I’m working on a Node app that acts as a proxy when uploading files to an S3 bucket. One of the requirements is to reduce the latency between the user upload and the server upload, and to never buffer files into memory before uploading them. So I checked the multer code and created something similar for my case, handling as many edge cases as possible, and I uploaded the chunks to the bucket successfully. Unfortunately, after checking the bucket, it creates 2 files: one of 0B and a second with the actual file size. When I upload the first file and check versions, I can see it as described above; when I upload another file, it overwrites the previously uploaded file, moves it to versions, and keeps only the latest upload, even though the filename is always unique and the files should never conflict with each other. I don’t know why that happens; I checked the code multiple times and couldn’t find the mistake causing the issue. Here’s my code and screenshots of the versions and the currently available files. There should be 2 files, while it keeps only one and (deleted / moved) the other file to versions.

  • here’s the image for versions

Versions image

  • an image of the existing files

existing files image

  • here’s my code
const BusBoy = require('busboy');
const appendField = require('append-field');
const createHttpError = require('http-errors');
const { Upload } = require('@aws-sdk/lib-storage');
const { DeleteObjectCommand, PutObjectCommand, AbortMultipartUploadCommand } = require('@aws-sdk/client-s3');
const onFinished = require('on-finished');
const { v4: uuidv4 } = require('uuid');
const path = require('path');
const { EventEmitter } = require('events');

class FilesCounter extends EventEmitter {
  constructor() {
    super();
    this.value = 0;
  }

  increment() {
    this.value++;
  }

  decrement() {
    if (--this.value === 0) {
      this.emit('zero');
    }
  }

  isZero() {
    return this.value === 0;
  }

  /**
   * @param {() => any} eventHandler
   * @returns {void}
   */
  onceZero(eventHandler) {
    if (this.isZero()) {
      return eventHandler();
    }

    this.once('zero', eventHandler);
  }
};

const errorMessages = {
  LIMIT_PART_COUNT: 'Too many parts',
  LIMIT_FILE_SIZE: 'File too large',
  LIMIT_FILE_COUNT: 'Too many files',
  LIMIT_FIELD_KEY: 'Field name too long',
  LIMIT_FIELD_VALUE: 'Field value too long',
  LIMIT_FIELD_COUNT: 'Too many fields',
  LIMIT_UNEXPECTED_FILE: 'Unexpected field',
  MISSING_FIELD_NAME: 'Field name missing',
  CLIENT_ABORTED: 'Client aborted',
};

class ErrorHandler extends Error {
  /**
   * @param {keyof errorMessages | { code: keyof errorMessages; seperate: string }} code
   * @param {string} field
   */
  constructor(code, field) {
    super();
    const messageCode = typeof code === 'string' ? code : code.code;

    this.name = this.constructor.name;
    this.message = errorMessages[messageCode].concat(typeof code === 'string' ? '' : ` ${code.seperate}`);
    this.code = messageCode;
    if (typeof field === 'string') {
      this.field = field;
    }

    Error.captureStackTrace(this, this.constructor);
  }
};


/**
 *  @typedef {{
    * fieldname: string;
    * originalname: string;
    * encoding: string;
    * mimetype: string;
    * filename: string;
 * }} file
 * @typedef {(filterErr: ErrorHandler | Error, allow: boolean) => any} callback
 */

module.exports = class S3UploadMiddleware {
  /**
   * @param {{
      * S3Client: import('@aws-sdk/client-s3').S3Client;
      * BUCKET_NAME: string;
      * logger?: import('winston').Logger | Console;
      * filenameGenerator?: (fileName: string) => string;
    * }} param0
   */
  constructor({ S3Client, logger, filenameGenerator, BUCKET_NAME }) {
    this.s3 = S3Client;
    this.logger = logger || console;
    this.filenameGenerator = filenameGenerator || this.generateUniqueFileName;
    this.BUCKET_NAME = BUCKET_NAME;
  }

  /**
   * @param {{
      * fields?: {name: string; maxCount: number}[],
      * limits?: import('busboy').Limits,
      * fileFilter?: (req: import('express').Request, file: Omit<import('busboy').FieldInfo, 'mimeType'> & { filename: string; mimetype: string }, cb: (err?: Error, reject?: boolean) => void) => any
   * }} options
  */
  createMiddleware(options) {
    // validate options
    if (!options.fields?.[0]?.name) {
      throw new Error('Invalid fields please specify the name of the fields you want to upload');
    }

    /**
    * @param {import('express').Request} req
    * @param {import('express').Response} res
    * @param {import('express').NextFunction} next
    */
    return async (req, res, next) => {
      // skip non-form-data content
      if (!this.isFormData(req.headers['content-type'])) return next();
      const ContentLength = Number(req.headers['content-length'] || 0);

      // >>>>>>>>>>>>>>>>>>>> init before create middleware <<<<<<<<<<<<<<<<<<
      // create a Map of field names and max-file limits, used to decide whether to allow a file upload, skip it, or abort the process
      const requiredFields = new Map((options.fields || []).map(e => [e.name, typeof e.maxCount === 'number' ? e.maxCount : Infinity]));
      const isSingleFile = (options.fields || []).length === 1 && (options.fields || [])[0]?.maxCount === 1;

      // handling filter files
      const fileFilter = options.fileFilter || ((req, file, cb) => cb(null, true));

      /**
       * wrapper to handle the case where a field doesn't exist in the fields array, deciding whether to skip it, continue, or abort the process when the field's max count is reached
       * @param {import('express').Request} req
       * @param {file} file
       * @param {callback} cb
       */
      const filesFilterWrapper = (req, file, cb) => {
        const filesMaxCount = requiredFields.get(file.fieldname);

        if (typeof filesMaxCount === 'undefined') {
          this.logger.warn(`Unexpected file field [${file.fieldname}] - skipping upload`);
          return cb(new ErrorHandler('LIMIT_UNEXPECTED_FILE', file.fieldname), false);
        }

        if (filesMaxCount <= 0) {
          this.logger.warn(`File limit exceeded for field [${file.fieldname}] - skipping`);
          return cb(new ErrorHandler('LIMIT_FILE_COUNT', file.fieldname), false);
        }

        // update the file limit counter
        requiredFields.set(file.fieldname, filesMaxCount - 1);
        fileFilter(req, file, cb);
      };

      // const ContentLength = req.headers['content-length'];
      // prepare request body parse form fields inside busboy 'field' event
      req.body = {};

      /**
       * @type {import('busboy').Busboy | undefined}
       */
      let busboy;

      try {
        busboy = BusBoy({ headers: req.headers, limits: options.limits });
      } catch (err) {
        this.logger.error('BusBoy init error', err);
        return next(err);
      }

      // >>>>>>>>>>>>> start handling file upload to bucket <<<<<<<<<<<<<<<<<<<

      // indicators
      let isDone = false;
      let readFinished = false;
      let errorOccured = false;

      // helpers to abort uploads / remove uploaded files, plus a counter of pending files so next() is called when the zero-files signal is received
      const pendingWrites = new FilesCounter();
      /**
       * @type {string[]}
       */
      const uploadedFiles = [];
      /**
       * @type {AbortController[]}
       */
      const abortControllers = [];

      // >>>>>>>>>>>>>>>>>> initialize helpers <<<<<<<<<<<<<<<<<<<
      // cleanup / delete uploaded files
      const cleanupUploadedFiles = async silent => {
        if (!uploadedFiles.length) return;
        // Use Promise.allSettled to handle cleanup gracefully, even if some deletions fail
        const results = await Promise.allSettled(uploadedFiles.map(this.deleteObject));

        const rejectedResult = results.find(result => result.status === 'rejected');
        if (rejectedResult) {
          this.logger.error(`Failed to clean up some files: ${rejectedResult.reason}`);
          cleanupBusboy();
          if (!silent) {
            // TODO: check if the reason can be thrown
            throw rejectedResult.reason; // this.mapBucketErrorToHttpError(rejectedResult.reason);
          }
        }

        this.logger.info('uploaded files clean up success');
      };

      // function to help clean up and abort processing files
      const abortWithError = err => {
        if (errorOccured) return;
        errorOccured = true;

        // Abort all ongoing file uploads
        abortControllers.forEach(controller => controller.abort('Operation Aborted!'));

        pendingWrites.onceZero(async () => {
          cleanupBusboy();

          try {
            console.time('Clean up all files done in');
            await cleanupUploadedFiles();
            console.timeEnd('Clean up all files done in');
          } catch (cleanupErr) {
            if (cleanupErr.name === 'AbortError') {
              this.logger.info('operation aborted');
            } else {
              this.logger.error('Delete object error', cleanupErr);
              return next(cleanupErr); // this.mapBucketErrorToHttpError(cleanupErr);
            }
          }

          next(err);
        });
      };

      function cleanupBusboy() {
        req.unpipe(busboy);
        busboy.removeAllListeners();
      }

      /**
       * helper function to call next and clean up when upload is complete without errors
       * @param {*} err
       */
      function handleDone(err) {
        if (isDone) return;
        isDone = true;

        cleanupBusboy();
        next(err);
      }

      /**
       * helper function to handle complete the process when everything is done
       */
      function completeProcess() {
        // check that the process completed successfully without errors and no files are pending
        if (readFinished && pendingWrites.isZero() && !errorOccured) {
          handleDone();
        }
      }

      // >>>>>>> Client Disconnect Handling <<<<<<<
      onFinished(res, () => {
        if (!isDone) {
          this.logger.warn('Client disconnected, aborting remaining uploads');
          return abortWithError(new ErrorHandler('CLIENT_ABORTED'));
        }

        console.log('Request finished, no need to abort');
      });

      // >>>>>>>>>>>>>>>>>> handling busboy events <<<<<<<<<<<<<<<<<<<
      busboy.on('file', (fieldname, fileStream, info) => {
        const { filename, encoding, mimeType } = info;
        if (!fieldname || !filename) return fileStream.resume(); // Skip if no file

        /**
         * init the file object to pass the file info to next function
         * @type {file}
         */
        const file = { fieldname, originalname: filename, encoding, mimetype: mimeType };

        // Pause the file stream before processing to prevent uncontrolled flow
        fileStream.pause();

        // handle filter files before start uploading to bucket
        filesFilterWrapper(req, file, (filterErr, allow) => {
          // always handle errors first to avoid sending the user to the next handler when a validation error occurs
          if (filterErr) return abortWithError(filterErr);
          if (!allow) return fileStream.resume(); // Skip disallowed files

          // create a file unique name to avoid files overwrite / names conflict
          file.filename = this.filenameGenerator(filename);
          // save the file name to the uploadedFiles array to handle abort / cleanup when an error occurs or the process is aborted
          uploadedFiles.push(file.filename);

          // check if it's a single file to pass it to req.file or multiple files to pass to req.files instead
          if (isSingleFile) {
            req.file = file;
          } else {
            req.files = (req.files || []).concat([file]);
          }

          // create an AbortController to handle aborting the process and cleanup
          const abortController = new AbortController();
          const controllerIndex = abortControllers.length;
          abortControllers.push(abortController);

          // increment the counter to signal that a file upload has started
          pendingWrites.increment();
          // Resume file stream once checks are complete
          fileStream.resume();

          console.log({ ContentLength, ContentType: file.mimetype, ContentEncoding: file.encoding });
          // let currentUploadId;

          this.uploadUnknowLengthStreamObject(fileStream, file.filename, {
            config: { signal: abortController.signal, tags: [{ Key: file.filename, Value: file.originalname }] },
            params: {
              ContentLength: ContentLength > this.BytesConverter.MBToBytes(5) ? ContentLength : undefined, // allow to upload files less than 5MB by setting ContentLength as undefined
              ContentEncoding: file.encoding,
              ContentType: file.mimetype || 'application/octet-stream',
            },
          })
            // this.uploadObject(fileStream, file.filename, {
            //   sendOptions: { abortSignal: abortController.signal },
            //   paramsOptions: {
            //     /* ContentLength,  */
            //     ContentEncoding: file.encoding,
            //     ContentType: file.mimetype || 'application/octet-stream',
            //     Tagging: `${file.filename}=${file.originalname}`,
            //   },
            // })
            .then(result => {
              this.logger.log(JSON.stringify(result, null, 2), '............. result .......................');
              this.logger.info(`File [${file.filename}] uploaded successfully`);
              // decrement to indicate a file has been uploaded
              pendingWrites.decrement();
              abortControllers.splice(controllerIndex, 1); // remove the controller from abortControllers array when successfully uploaded
              // call complete process to go next when zero files is pending
              completeProcess();

              // handle case where uploaded file size is smaller than 5MB to remove it after upload done
              if (errorOccured) {
                cleanupUploadedFiles();
              }
            })
            .catch(err => {
              // decrement so no files are left hanging
              pendingWrites.decrement();
              abortControllers.splice(controllerIndex, 1); // remove the controller from abortControllers array when done.
              if (err.name === 'AbortError') {
                this.logger.info('operation aborted');
              } else {
                this.logger.error('File upload error', err);
                // handle abort error
                abortWithError(err);
              }
            });

          fileStream.on('error', err => {
            // remove the pending file when an error occurred
            pendingWrites.decrement();
            abortWithError(err);
          });

          fileStream.on('limit', () => {
            // remove the pending file whose size limit was exceeded
            pendingWrites.decrement();
            abortWithError(
              new ErrorHandler(
                options.limits?.fileSize
                  ? { code: 'LIMIT_FILE_SIZE', seperate: `max size allowed is: ${this.BytesConverter.BytesToKB(options.limits.fileSize)}KB` }
                  : 'LIMIT_FILE_SIZE',
                fieldname,
              ),
            );
          });

          fileStream.on('close', () => {
            // completeFileUpload(); // Local file stream finished
            this.logger.info(`Finished processing file ${file.filename}`);
          });
        });
      });

      // handle parse formData and complex formData syntax like (pets[0][name] = value) then append it to req.body
      busboy.on('field', (name, value, info) => {
        if (!name) return abortWithError(new ErrorHandler('MISSING_FIELD_NAME'));
        const { nameTruncated, valueTruncated } = info;
        if (nameTruncated) return abortWithError(new ErrorHandler('LIMIT_FIELD_KEY'));
        if (valueTruncated) return abortWithError(new ErrorHandler('LIMIT_FIELD_VALUE', name));

        appendField(req.body, name, value);
      });

      // handling busboy events
      busboy.on('error', err => {
        busboy.removeAllListeners();
        abortWithError(err);
      });

      busboy.on('partsLimit', () => {
        abortWithError(new ErrorHandler('LIMIT_PART_COUNT'));
      });

      busboy.on('filesLimit', () => {
        abortWithError(new ErrorHandler('LIMIT_FILE_COUNT'));
      });

      busboy.on('fieldsLimit', () => {
        abortWithError(new ErrorHandler('LIMIT_FIELD_COUNT'));
      });

      busboy.on('close', () => {
        this.logger.info('busboy closed!');
      });

      busboy.on('finish', () => {
        readFinished = true;
        completeProcess();
      });

      req.pipe(busboy);
    };
  }

  BytesConverter = {
    /**
     * convert MB number to bytes
     * @param {number} n
     * @returns {number}
     */
    MBToBytes: n => n * (1024 ** 2),

    /**
     * convert bytes number to KB
     * @param {number} n
     * @returns {number}
     */
    BytesToKB: n => n / (1024 ** 1),

    /**
     * convert KB number to bytes
     * @param {number} n
     * @returns {number}
     */
    KBToBytes: n => n * (1024 ** 1),

    /**
     * convert bytes number to MB
     * @param {number} n
     * @returns {number}
     */
    BytesToMB: bytes => bytes / (1024 ** 2),
  };

  /**
   * check if content type header is form data
   * @param {string} str
   * @returns {boolean}
   */
  isFormData = str => typeof str === 'string' && str.startsWith('multipart/form-data');

  /**
  * create a file uniqueName
  * @param {string} originalname
  * @returns {string}
  */
  generateUniqueFileName(originalname) {
    const filename = uuidv4();
    const timestamp = Date.now();
    const name = originalname ? Buffer.from(originalname, 'latin1').toString('utf8') : 'file.png';
    return `${filename}--${timestamp}${path.extname(name)}`;
  }

  /**
  * Function to map bucket delete files errors to HTTP errors
  * @param {import('@aws-sdk/client-s3').S3ServiceException} err
  */
  mapBucketErrorToHttpError(err) {
    // If the bucket error has a specific status code or type, map it to HTTP errors
    if (err.$metadata.httpStatusCode) {
      // You can adjust the mappings based on the bucket service's status codes and error structure
      switch (err.$metadata.httpStatusCode) {
        case 401:
          return createHttpError(401, 'UnAuthorized: Access is denied');
        case 403:
          return createHttpError(403, 'Forbidden: Access is denied');
        case 404:
          return createHttpError(404, 'Not Found: The requested file does not exist');
        case 500:
          return createHttpError(500, 'Internal Server Error: service failure');
        default:
          return createHttpError(err.$metadata.httpStatusCode, err.message || 'operation failed');
      }
    }

    // For other types of errors or unknown status codes, return a generic 500 error
    return createHttpError(500, err.message || 'An unknown error occurred during the bucket operation');
  }

  /**
   * delete file from bucket
   * @param {string} fileName
   */
  deleteObject = async fileName => {
    try {
      const params = {
        Bucket: this.BUCKET_NAME,
        Key: fileName,
      };

      return this.s3.send(new DeleteObjectCommand(params));
    } catch (err) {
      this.logger.error('Error Delete Object', err);
      throw err;
    }
  };

  /**
   * stream upload unknown length file to bucket
   * @param {import('@aws-sdk/client-s3').PutObjectCommand['input']['Body']} fileStream
   * @param {string} fileName
   * @param {{config?: import('@aws-sdk/lib-storage').Configuration & { signal: AbortSignal }, params?: Omit<import('@aws-sdk/lib-storage').Options['params'], 'Body' | 'Key' | 'Bucket'> }} options
   */
  uploadUnknowLengthStreamObject = (fileStream, fileName, options = {}) => {
    try {
      /**
       * @type {import('@aws-sdk/lib-storage').Options['params']}
       */
      const params = {
        Bucket: this.BUCKET_NAME,
        Key: fileName,
        Body: fileStream,
        ACL: 'public-read', // make file publicly accessible
        ...(options.params || {}),
      };

      const upload = new Upload({
        client: this.s3, // Your S3 client instance
        params,
        queueSize: 2, // Parallelism
        // partSize: this.BytesConverter.KBToBytes(50), // 50KB part size to reduce chunk size
        ...(options.config || {}),
      });

      if (options.config?.signal) {
        options.config.signal.onabort = function () {
          console.log('............ upload aborted ..............');
          upload.abort();
        };
      }

      upload.on('httpUploadProgress', progress => {
        console.log(`Uploaded ${progress.loaded} of ${progress.total} bytes`);
      });

      return upload.done();
    } catch (err) {
      this.logger.error('Error Uploading Files', err);
      throw err;
    }
  };


  /**
   * @typedef {Parameters<typeof this.s3.send>[1]} sendOptions
   * @param {import('@aws-sdk/client-s3').PutObjectCommand['input']['Body']} fileStream
   * @param {string} fileName
   * @param {{
      * sendOptions: sendOptions,
      * paramsOptions: Omit<PutObjectCommand['input'], 'Bucket' | 'Key' | Body>
   * }} options
   */
  uploadObject = async (fileStream, fileName, options) => {
    try {
      /**
       * @type {PutObjectCommand['input']}
       */
      const params = {
        Bucket: this.BUCKET_NAME,
        Key: fileName,
        Body: fileStream,
        ...(options?.paramsOptions || {}),
        ACL: 'public-read', // make file publicly accessible
      };

      return this.s3.send(new PutObjectCommand(params), options?.sendOptions);
    } catch (err) {
      this.logger.error(err);
      throw err;
    }
  };
};

  • here’s the usage
const { s3 } = require('.');
const { BUCKET_NAME } = require('../config');
const S3UploadMiddleware = require('../lib/S3UploadMiddleware');

const s3Upload = new S3UploadMiddleware({
  logger: console,
  S3Client: s3,
  BUCKET_NAME,
});

const imagesWhiteList = ['image/png', 'image/jpeg', 'image/jpg', 'image/webp'];
const imageUpload = s3Upload.createMiddleware({
  limits: { fileSize: s3Upload.BytesConverter.MBToBytes(10) },
  fields: [{ name: 'image', maxCount: 1 }],
  fileFilter(req, file, cb) {
    // Accept images only from white list
    if (imagesWhiteList.indexOf(file.mimetype) > -1) {
      cb(null, true); // Accept the file
    } else {
      cb(new Error(req.t('INVALID_MIME_TYPE', { types: imagesWhiteList.map(e => e.split('/')[1]).join(', ') })), false); // Reject the file
    }
  },
});
  • I hope someone can help me fix this. Thanks!

Is there a way to wait for response from a function before sending a response to my frontend?

I am building logic to upload videos to Cloudinary using Node.js, Next.js, and the multer package.
I have a MongoDB document in which I would like to store the video URL and public ID once the Cloudinary upload function completes.
This means I would like to wait for the process to complete and store the result as part of my response before creating the document.

Unfortunately, I am unable to get this to work. The video upload takes time to complete, and I would like to wait for the response to decide what to send back to the client.
Here is the cloudinary upload function:

export const largeFileUploader = async (props: { targetedFile: string }) => {
  const { targetedFile } = props;

  try {
    cloudinary.uploader.upload_large(
      targetedFile,
      { resource_type: "video", chunk_size: 6000000, folder: 'folderName', upload_preset: 'presetName' },
      function (error, result) {
        console.log(error, 'error');
        if (error) throw 'Error encountered';

        return result;
      },
    );
  } catch (error) {
    console.log(error);
    throw 'file upload error';
  };
};

I wrote another function that can be invoked in a route:

const imageUploadFunct = async(req: Request, res: Response, eventType: events)=>{   
    const files = req?.files as Express.Multer.File[];
    const imageUrl: string[] = [];
    const publicId: string[] = [];

  if(eventType === 'video_upload_event'){
    const uploadResponse = await largeFileUploader({targetedFile: file?.path}) as any;

    imageUrl.push(uploadResponse?.secure_url);
    publicId.push(uploadResponse?.public_id);

      if(file?.path){
         removeFile({filePath: file?.path});
       };
     return { imageUrl, publicId };
  }
};

Then in my controller, I have this:

export const httpUploadVideo = async (req: Request, res: Response)=>{

    try {
        const response = await imageUploadFunct(req, res, 'video_upload_event');
        return res.status(200).json(response)
    } catch (error) {
        return res.status(500).json('Something went wrong, refresh your page or contact support');
    };

};

The Cloudinary upload method takes time to complete the video upload before returning a response. Before then, my client has already received a response from me. I need the Cloudinary response to send to my client, but before it can be ready, my code has already returned and sent a response to the client.
How do I make my code wait until Cloudinary returns a response before I send a response to the client?

I know there could be an option to update the Mongoose database in the background when Cloudinary sends feedback, by building something like an event hook. But I don't really wish to follow that path.
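One way to make the controller wait is to wrap the callback-based uploader in a Promise that resolves or rejects from inside the callback, then await it. A minimal sketch, using a hypothetical `fakeUpload` stand-in in place of `cloudinary.uploader.upload_large` (which invokes its callback with the same `(error, result)` shape):

```javascript
// Wrap a callback-style upload function in a Promise so callers can await it.
function uploadAsPromise(uploadFn, target, options) {
  return new Promise((resolve, reject) => {
    uploadFn(target, options, (error, result) => {
      if (error) return reject(error);
      resolve(result);
    });
  });
}

// Hypothetical stand-in uploader that reports success asynchronously,
// the way the real uploader invokes its callback when the upload finishes.
function fakeUpload(target, options, cb) {
  setTimeout(() => cb(null, { secure_url: 'https://example.com/' + target, public_id: 'abc123' }), 10);
}

async function main() {
  // The await does not continue until the wrapped callback fires.
  const result = await uploadAsPromise(fakeUpload, 'video.mp4', { resource_type: 'video' });
  return result.secure_url;
}
```

With `largeFileUploader` rewritten this way (returning the Promise instead of letting the callback's return value vanish), the `await` in the route handler would actually receive the upload result before the response is sent.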