Is it recommended in TypeScript to define event handler types separately and use them with const functions?

When defining functions in TypeScript, is the following approach — defining event types in a separate types.ts file and then declaring functions as const with that type — considered a recommended practice in modern development?

Example Code (feels a bit cumbersome to me)

types.ts:

export type MouseEventHandler = EventHandler<MouseEvent>;

event-listeners.ts:

export const onClickCreateItem: MouseEventHandler = async (event, deps) => {
  // some logic
};

Current Code (what I usually do)

event-listeners.ts:

export async function onClickCreateItem(
  event: MouseEvent,
  deps: CreateItemDependencies
) {
  // some logic
}

Notes

  • I’m fairly new to TypeScript, and my impression is: “Wait, do I really need to set up this much boilerplate just for types?”
  • If the answer is basically “Yes, that’s just how TypeScript is, and it’s considered good practice,” then I’ll happily follow that approach.
  • I asked ChatGPT, but its answers vary depending on how I phrase the question — sometimes it says one approach is recommended, sometimes the other. So I’d love to hear opinions from real developers here.

Thanks in advance!

Detect browser support for “closedby” attribute

The closedby attribute is supported in all major browsers except Safari. In browsers that do support it, setting closedby="all" on a dialog element enables light dismiss, including clicking on dialog::backdrop.

I tried to implement feature detection for Safari as follows (this is inside a click event listener):

// modal.close() is handled by `closedby` attribute on <dialog>
// except in Safari.
if (`closedby` in navigator) {
  console.log(`closedby supported`);
} else {
  modal?.close();
  console.log(`closedby not supported`);
};

But even on supporting browsers, the console message is closedby not supported.

How can I detect support for closedby?

Note: modal is a variable holding the dialog element.

How to integrate PayPal with the MERN stack, the correct way?

State machine (definitive)

cart → reserved (or pending) → payment_in_progress → paid → fulfilled

Failures / alternatives: reserved → expired → cancelled (stock returned); payment_in_progress → failed → cancelled (release stock or keep for manual review); paid → refunded (store refund events).

Make allowed transitions explicit in code to prevent invalid state changes.
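
As a minimal sketch of what that could look like (the status names follow the diagram above; the assertTransition helper is my own, not part of any library):

// order-status.ts — explicit whitelist of legal transitions; everything else is rejected.
export type OrderStatus =
  | "cart" | "reserved" | "payment_in_progress" | "paid" | "fulfilled"
  | "expired" | "cancelled" | "failed" | "refunded";

const ALLOWED: Record<OrderStatus, OrderStatus[]> = {
  cart: ["reserved"],
  reserved: ["payment_in_progress", "expired", "cancelled"],
  payment_in_progress: ["paid", "failed", "cancelled"],
  paid: ["fulfilled", "refunded"],
  fulfilled: [],
  expired: ["cancelled"],
  cancelled: [],
  failed: ["cancelled"],
  refunded: [],
};

export function assertTransition(from: OrderStatus, to: OrderStatus): void {
  if (!ALLOWED[from].includes(to)) {
    throw new Error(`Illegal order transition: ${from} -> ${to}`);
  }
}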


Global references you must read first (most important)

  • PayPal Orders API (create / capture orders) — read fully.
  • PayPal JS SDK reference (createOrder / onApprove) and checkout flow examples.
  • PayPal Webhooks & verify signature guide.
  • MongoDB Aggregation Pipeline (for analytics).
  • MongoDB Transactions (production constraints: replica sets).
  • MongoDB Text Index / Search docs (product search).
  • Redux Toolkit docs (slices, configureStore, thunks).
  • Express.js docs (routing / middleware best practices).

Use the above pages while implementing each corresponding module


Module-by-module implementation (algorithms + exact references to read)

A — Backend core (Express + DB)

Read before coding: Express guide (routing & middleware).

1) Project skeleton & middleware

Algorithm (no code):

  1. Create Express app and structured router directories: /routes, /controllers, /models, /services, /utils.

  2. Add JSON body parser, CORS (restrict origins in prod), Helmet for headers, rate limiter for sensitive endpoints, error handler middleware.

  3. Load .env and validate presence of required keys (MONGO_URI, PAYPAL_CLIENT_ID, PAYPAL_SECRET, WEBHOOK_ID).

    Where to read:

  • Express docs for middleware and error handling.
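
A rough sketch of steps 1–3, assuming the usual Express ecosystem packages (cors, helmet, express-rate-limit, dotenv); paths and limits are placeholders:

// app.ts — minimal Express skeleton with the middleware listed above.
import "dotenv/config";
import express from "express";
import cors from "cors";
import helmet from "helmet";
import rateLimit from "express-rate-limit";

// Fail fast if required configuration is missing.
for (const key of ["MONGO_URI", "PAYPAL_CLIENT_ID", "PAYPAL_SECRET", "WEBHOOK_ID"]) {
  if (!process.env[key]) throw new Error(`Missing required env var: ${key}`);
}

const app = express();
app.use(express.json());
app.use(cors({ origin: process.env.CORS_ORIGIN })); // restrict origins in production
app.use(helmet());
app.use("/api/orders", rateLimit({ windowMs: 60_000, max: 30 })); // sensitive endpoint

// ...mount routers from /routes here...

// Central error handler (four arguments so Express treats it as one).
app.use((err: Error, _req: express.Request, res: express.Response, _next: express.NextFunction) => {
  console.error(err);
  res.status(500).json({ error: "Internal server error" });
});

app.listen(process.env.PORT ?? 4000);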

2) DB connection & transactions

Algorithm:

  1. Connect to MongoDB — ensure it is a replica set to enable transactions (Atlas or single-node replica). If not possible, fall back to atomic findOneAndUpdate + compensating rollback.
  2. Expose a helper to start sessions and transactions for multi-document work (order creation + stock updates). If session/transaction creation fails (standalone), use per-item atomic updates with a compensating rollback algorithm and tight logging.

Read:

  • MongoDB production transactions doc.
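
One possible shape for that helper, assuming Mongoose (the withTransaction name is my own):

// db.ts — connection plus a small transaction helper.
import mongoose, { ClientSession } from "mongoose";

export async function connectDb(): Promise<void> {
  await mongoose.connect(process.env.MONGO_URI as string);
}

// Runs `work` inside a transaction where the deployment supports it
// (replica set / Atlas). On a standalone server, fall back to atomic
// per-document updates with compensating rollbacks instead.
export async function withTransaction<T>(
  work: (session: ClientSession) => Promise<T>
): Promise<T> {
  const session = await mongoose.startSession();
  try {
    let result!: T;
    await session.withTransaction(async () => {
      result = await work(session);
    });
    return result;
  } finally {
    await session.endSession();
  }
}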

3) Models & indexes (MongoDB)

Design checklist (fields):

  • products: _id, name, description, category, price, currency, stock, imageUrl, createdAt, updatedAt
    • Create text index on name + description for search.
    • Additional indexes: category, price, stock (for low stock queries).
  • orders: _id, orderNumber, items[{productId, qty, priceAtPurchase}], totalAmount, currency, customer{}, status, paypal:{orderId,captureId,raw}, createdAt, expiresAt, paymentEvents[]
  • counters or use a sequence generator for orderNumber.

Read:

  • MongoDB indexes, text index docs.
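
A sketch of the products model with those indexes, assuming Mongoose:

// product.model.ts — products collection per the checklist above.
import { Schema, model } from "mongoose";

const productSchema = new Schema(
  {
    name: { type: String, required: true },
    description: { type: String, default: "" },
    category: { type: String, index: true },
    price: { type: Number, required: true, min: 0 },
    currency: { type: String, default: "USD" },
    stock: { type: Number, required: true, min: 0, index: true },
    imageUrl: String,
  },
  { timestamps: true } // createdAt / updatedAt
);

// Text index on name + description for product search.
productSchema.index({ name: "text", description: "text" });

export const Product = model("Product", productSchema);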

4) Product endpoints

Algorithm:

  • GET /products:
    1. Parse q, category filters, price range, sort, page, limit.
    2. Use $text search if q present; project score. Apply filters; send pagination meta. (If you need better search later, use Atlas Search.)
  • GET /products/:id: validate id, return product details.

Read:

  • MongoDB text search / aggregation basics.
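
A sketch of GET /products along those lines, using the Product model above (query parameter names are assumptions):

// products.routes.ts — text search, filters and pagination meta.
import { Router } from "express";
import { Product } from "./product.model";

export const productsRouter = Router();

productsRouter.get("/products", async (req, res) => {
  const { q, category, page = "1", limit = "20" } = req.query as Record<string, string>;

  const filter: Record<string, unknown> = {};
  if (q) filter.$text = { $search: q };   // uses the text index
  if (category) filter.category = category;

  const pageNum = Math.max(parseInt(page, 10) || 1, 1);
  const perPage = Math.min(Math.max(parseInt(limit, 10) || 20, 1), 100);

  const [items, total] = await Promise.all([
    Product.find(filter, q ? { score: { $meta: "textScore" } } : {})
      .sort(q ? { score: { $meta: "textScore" } } : { createdAt: -1 })
      .skip((pageNum - 1) * perPage)
      .limit(perPage),
    Product.countDocuments(filter),
  ]);

  res.json({ items, meta: { page: pageNum, limit: perPage, total } });
});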

5) POST /orders — create + reserve stock (core corrected algorithm)

This is critical — follow exactly.

Algorithm:

  1. Validate request (items array and user details). Do not trust client totals.
  2. Recompute totals server-side by fetching current price for each product and summing price * qty. Store priceAtPurchase per item. (This prevents price manipulation.)
  3. Start a transaction session (if replica set available). Inside transaction:
    • For each item run an atomic conditional update: findOneAndUpdate({_id: productId, stock: {$gte: qty}}, {$inc: {stock: -qty}}). If any update fails (not enough stock), abort the transaction and return which items are out of stock.
    • Create order document with status = "reserved", expiresAt = now + RESERVATION_TTL (e.g., 15 minutes). Store orderNumber (get next from counters via atomic $inc).
  4. Commit transaction; return orderId + expiresAt.
  5. If transactions unavailable:
    • For each item findOneAndUpdate with stock >= qty. Track successful updates. If any fail, do compensating increments for previously decremented products (careful to log failures and retry). Create order only after all decrements succeed. (This is less safe than transactions but workable.)

      Why this corrected approach:

  • Reserving stock at order creation prevents race where payment capture later would succeed for an out-of-stock item. This requires atomicity across multiple product updates — hence transactions are recommended.
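
A sketch of the transactional reserve step, reusing the withTransaction helper and Product model from the earlier sketches plus a hypothetical Order model:

// orders.service.ts — reserve stock and create the order atomically.
import { Product } from "./product.model";
import { Order } from "./order.model"; // hypothetical model matching the checklist above
import { withTransaction } from "./db";

const RESERVATION_TTL_MS = 15 * 60 * 1000;

export async function createReservedOrder(
  items: { productId: string; qty: number }[],
  customer: Record<string, unknown>
) {
  return withTransaction(async (session) => {
    let total = 0;
    const orderItems: { productId: string; qty: number; priceAtPurchase: number }[] = [];

    for (const { productId, qty } of items) {
      // Atomic conditional decrement: only succeeds if enough stock remains.
      const product = await Product.findOneAndUpdate(
        { _id: productId, stock: { $gte: qty } },
        { $inc: { stock: -qty } },
        { session, new: true }
      );
      if (!product) throw new Error(`Out of stock: ${productId}`); // aborts the transaction

      // Recompute totals server-side; never trust client prices.
      total += product.price * qty;
      orderItems.push({ productId, qty, priceAtPurchase: product.price });
    }

    const [order] = await Order.create(
      [{
        items: orderItems,
        totalAmount: total,
        customer,
        status: "reserved",
        expiresAt: new Date(Date.now() + RESERVATION_TTL_MS),
      }],
      { session }
    );
    return order;
  });
}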

6) POST /paypal/create-order (server create PayPal order)

Algorithm:

  1. Client sends internal orderId. Server re-fetches order and recomputes/validates amounts.
  2. Call PayPal Orders API server-side to create a PayPal order with the authoritative amount and line items. Save returned paypal.orderId in orders.paypal.orderId and update order status = "payment_in_progress". Return paypal.orderId to client. (Do not let client create PayPal order with client-side totals.)
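
A sketch of the server-side call using the PayPal REST endpoints directly (sandbox base URL; the official server SDK wraps the same Orders API):

// paypal.service.ts — create a PayPal order with the authoritative amount.
export const PAYPAL_BASE = "https://api-m.sandbox.paypal.com";

export async function getAccessToken(): Promise<string> {
  const auth = Buffer.from(
    `${process.env.PAYPAL_CLIENT_ID}:${process.env.PAYPAL_SECRET}`
  ).toString("base64");
  const res = await fetch(`${PAYPAL_BASE}/v1/oauth2/token`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: "grant_type=client_credentials",
  });
  const data = (await res.json()) as { access_token: string };
  return data.access_token;
}

export async function createPayPalOrder(totalAmount: number, currency: string) {
  const token = await getAccessToken();
  const res = await fetch(`${PAYPAL_BASE}/v2/checkout/orders`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      intent: "CAPTURE",
      purchase_units: [
        { amount: { currency_code: currency, value: totalAmount.toFixed(2) } },
      ],
    }),
  });
  if (!res.ok) throw new Error(`PayPal create order failed: ${res.status}`);
  return (await res.json()) as { id: string; status: string };
}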

7) Capture flow (recommended server capture + webhook fallback)

Algorithm (server capture endpoint):

  1. Client gets approval via PayPal JS UI and calls server POST /paypal/capture with orderId (internal) & paypalOrderId.
  2. Server calls PayPal Orders capture endpoint server-side, verifies:
    • captured amount equals expected order.totalAmount and currency matches,
    • payer status is valid.
  3. If success:
    • Update orders.status = "paid", store captureId and raw response in orders.paypal.
    • Create fulfillment actions (email invoice, update analytics).
  4. If capture fails due to network/timeouts: mark payment_in_progress and queue retry; DO NOT mark paid unless verified.
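
A sketch of the server-side capture call, reusing the paypal.service.ts sketch above (the response shape should be double-checked against the Orders API reference):

// paypal-capture.service.ts — capture and return the fields we verify against our order.
import { getAccessToken, PAYPAL_BASE } from "./paypal.service";

export async function capturePayPalOrder(paypalOrderId: string) {
  const token = await getAccessToken();
  const res = await fetch(`${PAYPAL_BASE}/v2/checkout/orders/${paypalOrderId}/capture`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  });
  if (!res.ok) throw new Error(`PayPal capture failed: ${res.status}`);

  const data: any = await res.json();
  // Pull out the capture details; verify amount/currency against our own order
  // document before marking it paid, and store the raw response on orders.paypal.
  const capture = data?.purchase_units?.[0]?.payments?.captures?.[0];
  return {
    captureId: capture?.id,
    amount: capture?.amount?.value,
    currency: capture?.amount?.currency_code,
    raw: data,
  };
}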

Webhook redundancy:

  • Implement POST /webhooks/paypal to accept PayPal events (e.g., PAYMENT.CAPTURE.COMPLETED) and verify each webhook using PayPal signature verification or verify endpoint. Use idempotent processing (store PayPal event id).

Read:

  • PayPal Orders API + capture docs and JS SDK.

8) Webhooks (precise)

Algorithm:

  1. Receive event; verify with PayPal using verify-webhook-signature or equivalent; reject if verification fails.
  2. Check event idempotency store (if processed, return 200).
  3. Find related order by paypal.orderId or by captureId in payload. Re-verify amounts if necessary via server call to PayPal.
  4. Update orders.status accordingly (e.g., paid), store raw event, and ack.
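
A sketch of that route, reusing the paypal.service.ts sketch, assuming an Order model and a hypothetical ProcessedEvent collection as the idempotency store (express.json() must be mounted so req.body is the parsed event):

// webhooks.routes.ts — verify, dedupe, then apply the event.
import { Router } from "express";
import { getAccessToken, PAYPAL_BASE } from "./paypal.service";
import { Order } from "./order.model";                     // hypothetical
import { ProcessedEvent } from "./processed-event.model";  // hypothetical, keyed by PayPal event id

export const webhooksRouter = Router();

webhooksRouter.post("/webhooks/paypal", async (req, res) => {
  const event = req.body;

  // 1. Verify the signature via PayPal's verify-webhook-signature endpoint.
  const token = await getAccessToken();
  const verifyRes = await fetch(`${PAYPAL_BASE}/v1/notifications/verify-webhook-signature`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      auth_algo: req.header("paypal-auth-algo"),
      cert_url: req.header("paypal-cert-url"),
      transmission_id: req.header("paypal-transmission-id"),
      transmission_sig: req.header("paypal-transmission-sig"),
      transmission_time: req.header("paypal-transmission-time"),
      webhook_id: process.env.WEBHOOK_ID,
      webhook_event: event,
    }),
  });
  const verification: any = await verifyRes.json();
  if (verification.verification_status !== "SUCCESS") return res.status(400).end();

  // 2. Idempotency: skip events we have already processed.
  if (await ProcessedEvent.findOne({ eventId: event.id })) return res.status(200).end();

  // 3. Apply the event (re-check the payload shape against the webhooks docs).
  if (event.event_type === "PAYMENT.CAPTURE.COMPLETED") {
    await Order.updateOne(
      { "paypal.captureId": event.resource?.id },
      { $set: { status: "paid" }, $push: { paymentEvents: event } }
    );
  }

  await ProcessedEvent.create({ eventId: event.id });
  return res.status(200).end();
});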

B — Stock expiry & reconciliation worker (must do)

Algorithm (no code):

  1. Run background job every N minutes (e.g., every 1–5 minutes). Query orders where status = "reserved" and expiresAt < now.
  2. For each expired order:
    • Start transaction (if possible) and increment products.stock by each reserved qty (reverse the decrements) and update orders.status = "expired" / cancelled. Store a release event in order.paymentEvents.
    • If transaction not available, perform safe per-item increment with retry and update order status.
  3. Notify user via email or show message on retry.

Why: TTL deletes will not restore stock; TTL is for deletion only — so we implement this worker to release reserved stock reliably.
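
A sketch of that worker, reusing the withTransaction helper and models from the earlier sketches (the paymentEvents field follows the order checklist above):

// expiry-worker.ts — release stock held by expired reservations.
import { Order } from "./order.model";     // hypothetical
import { Product } from "./product.model";
import { withTransaction } from "./db";

export async function releaseExpiredReservations(): Promise<void> {
  const expired = await Order.find({ status: "reserved", expiresAt: { $lt: new Date() } });

  for (const order of expired) {
    await withTransaction(async (session) => {
      // Reverse the stock decrements made at reservation time.
      for (const item of order.items) {
        await Product.updateOne(
          { _id: item.productId },
          { $inc: { stock: item.qty } },
          { session }
        );
      }
      order.status = "expired";
      order.paymentEvents.push({ type: "reservation_released", at: new Date() });
      await order.save({ session });
    });
  }
}

// Naive scheduler; a cron library or job queue is more robust in production.
setInterval(() => releaseExpiredReservations().catch(console.error), 5 * 60 * 1000);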


C — Frontend (React + Redux Toolkit)

Read before coding: Redux Toolkit usage & React fundamentals.

1) State design (Redux slices)

Logical slices:

  • productsSlice — search results, product details, loading/errors.
  • cartSlice — cart items, computed totals, persist to localStorage (rehydrate on load).
  • orderSlice — create order (reservation), track orderId, expiresAt, order status.
  • uiSlice — global toasts / loaders.
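
As an illustration, a sketch of the cart slice with localStorage persistence (Redux Toolkit; the item shape is an assumption):

// cartSlice.ts
import { createSlice, PayloadAction } from "@reduxjs/toolkit";

export interface CartItem {
  productId: string;
  name: string;
  price: number;
  qty: number;
}

const STORAGE_KEY = "cart";

const loadCart = (): CartItem[] => {
  try {
    return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
  } catch {
    return [];
  }
};

const persist = (items: CartItem[]) =>
  localStorage.setItem(STORAGE_KEY, JSON.stringify(items));

const cartSlice = createSlice({
  name: "cart",
  initialState: { items: loadCart() },
  reducers: {
    addItem(state, action: PayloadAction<CartItem>) {
      const existing = state.items.find((i) => i.productId === action.payload.productId);
      if (existing) existing.qty += action.payload.qty;
      else state.items.push(action.payload);
      persist(state.items);
    },
    removeItem(state, action: PayloadAction<string>) {
      state.items = state.items.filter((i) => i.productId !== action.payload);
      persist(state.items);
    },
    clearCart(state) {
      state.items = [];
      persist(state.items);
    },
  },
});

export const { addItem, removeItem, clearCart } = cartSlice.actions;
export default cartSlice.reducer;

Persisting from inside reducers is a shortcut; a store.subscribe listener or middleware is the cleaner place for that side effect.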

Algorithm for checkout UI flow:

  1. User clicks checkout → open Checkout form page. Validate with react-hook-form or plain validation.
  2. On submit, call POST /orders (server will compute totals and reserve stock). If server returns stock error, update cart & show item specific messages.
  3. If the order is reserved, call POST /paypal/create-order (server side) to get paypal.orderId, then render the PayPal button with the PayPal JS SDK, returning that PayPal order id from createOrder. On approval, call server POST /paypal/capture.
  4. Show order/:orderNumber status page with polling or server push to show paid once webhook processed.

Read:

  • PayPal JS SDK createOrder/onApprove flow.
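
A sketch of that button component using @paypal/react-paypal-js, wired to the server endpoints above (endpoint paths and the env variable name are assumptions; older versions of the package use the "client-id" option key instead of clientId):

// PayPalButton.tsx
import React from "react";
import { PayPalScriptProvider, PayPalButtons } from "@paypal/react-paypal-js";

export function PayPalButton({ orderId }: { orderId: string }) {
  return (
    <PayPalScriptProvider options={{ clientId: process.env.REACT_APP_PAYPAL_CLIENT_ID! }}>
      <PayPalButtons
        // Ask our server to create the PayPal order for the reserved order.
        createOrder={async () => {
          const res = await fetch("/api/paypal/create-order", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ orderId }),
          });
          const { paypalOrderId } = await res.json();
          return paypalOrderId;
        }}
        // After buyer approval, let the server capture and verify the payment.
        onApprove={async (data) => {
          await fetch("/api/paypal/capture", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ orderId, paypalOrderId: data.orderID }),
          });
        }}
      />
    </PayPalScriptProvider>
  );
}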

2) UX pitfalls & fixes

  • Show reservation expiry countdown to user (from expiresAt) so user knows time left.
  • Handle server rejection on order creation (stock changed) gracefully — update UI and let user continue.
  • Persist cart in localStorage and allow merging with server cart if user logs in (optional).

D — Analytics & aggregation (MongoDB)

Algorithm (analytics endpoint):

  1. For sales by category:
    • Match orders.status = "paid", date range filter.
    • $unwind items, $group by items.category (or product ref) summing qty and revenue = qty * priceAtPurchase.
    • $sort by revenue/qty, $limit. Use $project to shape results. (This is standard aggregation pipeline use.)

Read:

  • Aggregation pipeline overview & stage reference.

E — Security, validation & infra

Security checklist (read & implement)

  • Validate + sanitize all request bodies (Joi / AJV or manual).
  • Use Helmet, set secure CORS, use rate limiting on endpoints (esp POST /orders, /paypal/*, login).
  • Use HTTPS in production; store secrets in env.
  • Verify PayPal webhooks signatures; store processed event IDs to avoid duplicates.

Infra

  • Use MongoDB Atlas (supports transactions) or configure single-node replica set for local testing.
  • Host Node backend on Heroku/Render/VPS — ensure stable URL for PayPal webhooks. Add health checks and logging.


Testing checklist (how to validate correctness)

  1. Unit tests: price calculation, order total recomputation, stock decrement/increment logic.
  2. Integration tests: multi-item order create with concurrent requests for last unit — ensure only one reserves. (Simulate concurrency.)
  3. End-to-end: Use PayPal sandbox to create and capture; also simulate webhook calls from PayPal sandbox or via Postman. Verify server receives and verifies webhook and marks order paid.
  4. Failure tests: network failures during capture (server retry logic), webhook duplicate handling, reservation expiry worker behavior.
  5. Analytics tests: seed sample paid orders and validate aggregation results.

Quick “what to read for each module” (one-line links to docs)

  • Express.js basics & error handling.
  • MongoDB transactions & replica set constraints.
  • MongoDB aggregation pipeline (analytics).
  • MongoDB text index & search best practices (product search).
  • PayPal Orders API (create/capture) — server integration.
  • PayPal JavaScript SDK (createOrder/onApprove) — client integration.
  • PayPal Webhooks & verification best practices.
  • Redux Toolkit usage & configureStore/tutorial.

Final, exact recommended sequence (refined & safe)

  1. Repo + env skeleton + docs references (above).
  2. Backend skeleton (Express, middleware, DB connect to a replica set or Atlas).
  3. Implement products endpoints + seed DB + text index.
  4. Implement counters & order model.
  5. Implement POST /orders with transactional reserve algorithm (or safe fallback).
  6. Implement POST /paypal/create-order (server), integrate PayPal Orders API.
  7. Implement POST /paypal/capture server capture + POST /webhooks/paypal with signature verification + idempotency.
  8. Implement reservation expiry worker and reconciliation.
  9. Frontend basic build (React + Redux Toolkit) and connect to backend flows.
  10. End-to-end tests with PayPal sandbox; fix race conditions.
  11. Demo video following earlier checklist.


Phase 1: Backend Setup

  1. Initialize Project

    • npm init -y, install: express, mongoose, cors, dotenv, express-validator, and the official PayPal server SDK (the legacy paypal-rest-sdk is deprecated).

    • Setup folder structure:

      /backend
        /models
        /routes
        /controllers
        /config
      
      
  2. Configure MongoDB

    • Use mongoose.connect(process.env.MONGO_URI).
    • Ensure indexes for product name/description.
  3. Create Models

    • Product Model: name, price, category, stock, imageURL, description.
    • Order Model: customer info, items (with product references), status, PayPal transaction ID.
  4. Core APIs

    • /api/products → list, filter, search (with regex or $text index).
    • /api/products/:id → single product.
    • /api/orders → create order (validations).
    • /api/orders/:id → get order details.
    • /api/paypal/create-order → initiate PayPal order.
    • /api/paypal/capture-order → confirm payment + update order status.
    • /api/analytics → MongoDB aggregation (sales by category, top sellers, low stock).
  5. Middleware

    • Error handler for 400/500 responses.
    • Validation middleware (express-validator).

Phase 2: Frontend Setup

  1. React App Setup
    • npx create-react-app frontend
    • Install: react-router-dom, @reduxjs/toolkit, react-redux, axios.
  2. Redux Setup
    • Store structure:

      /frontend/src
        /store
          store.js
          productSlice.js
          cartSlice.js
          orderSlice.js
      
      
    • Slices:

      • productSlice → fetch, search, filter.
      • cartSlice → add/remove/update items, persist in localStorage.
      • orderSlice → create order, track status.
  3. Components
    • Header → nav + cart icon.
    • ProductList → grid of product cards.
    • ProductCard → image, name, price, add-to-cart.
    • Cart → items, quantity control, summary.
    • Checkout → form for customer info, order summary.
    • PayPalButton → integrate SDK.
    • OrderConfirmation → display after payment.

Phase 3: Data Flow & Integration

Correct Flow:

  1. User → sees product list (from /api/products).
  2. User → adds items to cart (Redux → localStorage).
  3. User → goes to checkout:
    • Enter info → create order (/api/orders).
  4. App → call /api/paypal/create-order → PayPal popup opens.
  5. On success → PayPal returns transaction ID → call /api/paypal/capture-order.
  6. Backend → updates order status → return success response.
  7. Frontend → show confirmation page.

Phase 4: Testing

  • API testing: use Postman (products, orders, PayPal flow, analytics).
  • Frontend testing: ensure cart persists after refresh, errors handled gracefully.
  • Payment testing: use PayPal Sandbox test card.

Phase 5: Demo Preparation

  1. Walkthrough Flow:
    • Product search/filter.
    • Cart → Checkout → Payment → Confirmation.
    • Show analytics API in Postman.
  2. Redux DevTools: show cart & order slices updating.
  3. Explain Code Structure: highlight models, controllers, slices.

Corrections to Flow

  • Always create order in DB before PayPal payment → ensures rollback if payment fails.
  • Cart must persist in localStorage → prevents losing items on refresh.
  • Analytics should be separate endpoint (not mixed in main APIs).
  • Never directly trust PayPal frontend response → always verify in backend with PayPal SDK.

Next.js 15 Pages Router – Dynamic routes showing root page on refresh (static export)

I upgraded an app to NextJS v15 and my dynamic routes are broken after npm run build with output: "export".

When I refresh the browser on a dynamic route like /projects/123, the root page renders instead of my [id].js component. The URL stays correct but the wrong page loads.

Setup

  • Next.js 15 with Pages Router (not App Router)
  • output: "export" in next.config.js
  • Dynamic route: pages/projects/[id].js
  • Hosting: Netlify (but seems to happen with any static server)

What happens

  1. Navigate to /projects/123 via Link – ✅ works fine
  2. Refresh the page – ❌ shows homepage content (URL still shows /projects/123)

Workaround I found

Had to add this to _app.js to force the router to recognize the route:

useEffect(() => {
  const path = window.location.pathname;
  if (path !== '/' && router.pathname === '/') {
    router.replace(path);
  }
}, []);

Unfortunately, the above hack breaks 404 routing because every route is considered valid. It’s odd to me that this worked fine in Next v13, but now I can’t find documentation that it was actually ever supported.

PHP script (HTML) executes fast but the results take up to 60 seconds to appear in the browser window

I have two very similar PHP scripts on the same server. The HTML part of these scripts is exactly the same; only the PHP subroutines are quite different.
Both scripts run in parallel in the same browser window. One executes in the normal, quick way; the other is suddenly very slow and takes up to 60 seconds.
I used microtime to measure the execution time and found that the whole PHP script only takes around 0.1 seconds to execute.
But the big difference in total display time is always the same on PC, notebook and smartphone.
I have no idea what the reason could be.
Here is the HTML part of my scripts.

<!DOCTYPE html>
<html lang="de">
<head>
<title>CurrentListeners</title>
    <meta http-equiv="content-type" content="text/html" />  <meta charset="UTF-8" />
    <meta http-equiv="pragma" content="no-cache">
    <meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
    <meta http-equiv="Expires" content="0" />   
    <link href="https://cdn.jsdelivr.net/npm/[email protected]/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-1BmE4kWBq78iYhFldvKuhfTAU6auU8tT94WrHftjDbrCEXSU1oBoqyl2QvZ6jIW3" crossorigin="anonymous">
    <link rel="stylesheet" type="text/css" href="showListeners.css">
</head>
<body>
        <?php echo getHTML(); ?>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/js/bootstrap.bundle.min.js" integrity="sha384-ka7Sk0Gln4gmtz2MlQnikT1wXgYsOg+OMhuP+IlRH9sENBO0LRn5q+8nbTov4+1p" crossorigin="anonymous"></script>
    <script type="text/javascript" src="showListeners.js"></script>
</body>
</html>


uuuuuh… what’s wrong with my perlin noise?

It’s just being weird:
https://editor.p5js.org/Argumentative/sketches/dJydzUbU_

I’ve followed multiple papers on the subject but it’s taken quite a bit of logic on my part to get something relatively similar to perlin noise. I get the feeling I’ve just matched up the corners to the equation incorrectly. The paper I’m basing this all off of is from the University of Maryland:
https://www.cs.umd.edu/class/spring2018/cmsc425/Lects/lect13-2d-perlin.pdf
NOTE:
I have no interest in anything other than 2D perlin noise.

Real-time audio translator extension for Chrome with a local translation service

I want to develop a real-time audio translator extension for Chrome that uses a local translation service. The service provides an API that accepts an audio snippet in .wav or .mp3 format and returns the translation as a string. For example, to translate a YouTube video, the audio has to be passed to the API and the returned text should be shown over the video as a subtitle.

API service

Input:

  1. source language – default english
  2. audio file – .wav || .mp3
  3. target language – default german

How do I implement a browser extension that sends audio chunks (3 or 5 seconds long) to the API service and shows the returned translated text as a subtitle over the video in real time?

The main challenge is capturing the audio, chunking it, and sending it to the API. I tried to use MediaRecorder but ran into a lot of errors, and I don't know if this is the right approach.
A good minimal example would help a lot.

Firebase Storage CORS error persists despite correct gsutil configuration and cache clearing

I’m developing a simple web app using HTML and vanilla JavaScript. The app allows users to upload an image, which is then sent to Firebase Storage. My web app is hosted at https://esstsas.net.

The Problem:

When I try to upload an image to my Firebase Storage bucket (gs://checksst-faster.firebasestorage.app), the request fails with a CORS preflight error in the browser console:

Access to XMLHttpRequest at ‘…’ from origin ‘https://esstsas.net‘ has been blocked by CORS policy: Response to preflight request doesn’t pass access control check: It does not have HTTP ok status.

This is the relevant JavaScript code that handles the image upload: I am using the Firebase JS SDK (v11.6.1). The storage object has been correctly initialized.

JavaScript

// This function is called when the "Save" button is clicked
async function saveInspection() {
    const file = lastSelectedFile; // lastSelectedFile is the user's selected image File object
    const userId = auth.currentUser.uid;

    if (!file || !userId) {
        console.error("File or User ID is missing.");
        return;
    }

    try {
        console.log("Step 1: Resizing image...");
        const resizedImageBlob = await resizeImage(file); // Resizes image to a Blob
        console.log("Step 2: Uploading to Storage...");

        const fileName = `${Date.now()}-${file.name}`;
        const storageRef = ref(storage, `images/${userId}/${fileName}`);
        
        // This is the line that fails
        const uploadTask = await uploadBytes(storageRef, resizedImageBlob);
        
        console.log("Step 3: Getting download URL...");
        const downloadURL = await getDownloadURL(uploadTask.ref);

        // ... code to save the downloadURL to Firestore would go here
        console.log("Upload successful:", downloadURL);

    } catch (error) {
        console.error("DETAILED UPLOAD ERROR:", error);
    }
}

What I’ve already tried (Troubleshooting Steps):

I am certain this is not a simple configuration error because I have performed extensive troubleshooting:

  • CORS Configuration: I used the gcloud CLI to set the CORS policy on my bucket. The command gsutil cors get gs://checksst-faster.firebasestorage.app confirms the policy is set correctly with my origin: [{"origin": ["https://esstsas.net", "https://www.esstsas.net"], "method": ["GET", "POST", "OPTIONS"], ...}]

  • Wildcard Test: As a test, I also applied a wildcard policy with "origin": ["*"]. The error still persisted.

  • Client-Side Caching: I have ruled out browser caching by performing a hard refresh (Ctrl+Shift+R), testing in incognito mode, using a completely different browser, and testing on a different network (mobile hotspot).

  • Firebase Rules: My Firestore and Storage security rules are correctly configured to allow access for authenticated users (allow read, write: if request.auth != null && request.auth.uid == userId;).

Even with the server-side CORS configuration confirmed as correct and after eliminating all client-side caching possibilities, the browser still receives a failed preflight response. I cannot create a formal Google Cloud support ticket as I am on the free plan.

Is there any other project-level setting, network issue, or known bug that could be overriding the bucket’s CORS policy?

Detect end of page in Safari on macOS 15 using plain JavaScript (zoom-safe)

How can I reliably detect the end of the page in Safari on macOS 15 using plain JavaScript, in a way that works across zoom levels and dynamic content?

I have the following code that scrolls the page downward smoothly:

window.__scrollInterval = setInterval(() => {window.scrollBy(0, 0.3, 'smooth');}, 0);

What I need is a reliable way to detect when the page has reached the bottom. The detection must work regardless of zoom level, which is where most solutions fail—especially in Safari.

When the end of the page is reached, I want to trigger:

alert('End');

Requirements:

  • Must work in Safari on macOS 15

  • Must be written in plain JavaScript (no libraries)

  • Must be robust against zoom level changes

Pages to test on:

Loading icon does not re-render every time the fetched data changes

I am working on a React front-end project and want to show a loading icon while data is loading. It worked well on the initial load, but does not work when the data changes. Here are the useEffect hooks that handle this:

useEffect(() => {
  setLoading(true);
  if (!frameResults.length) {
    return;
  }
  if (typeof graphSetting?.movementDomain[1] === 'string') {
    if (graphSetting.movementDomain[1] === 'dataMax+5') {
      yAxisDomainRef.current = [0, getDataMax(frameResults) + 5];
    }
  } else if (typeof graphSetting?.movementDomain[1] === 'number') {
    yAxisDomainRef.current = graphSetting.movementDomain;
  }
}, [frameResults]);

useEffect(() => {
  setLoading(true);

  if (frameResults.length < 1) {
    return;
  }
  Boolean(selectedIdx)
  const options = new Array({ value: 0, label: t("Overview") })
    .concat(frameSectionStartTime.map(
      (time, idx) => ({
        value: idx + 1,
        label: `${t("Section")} ${idx + 1}: ${time.split(" ")[1]}`
      })
    ));
  setSectionOption(options);
  setSelected(1);
  setLoading(false);
}, [frameResults]);

useEffect(() => {
  setLoading(true);

  if (frameResults.length < 1) {
    return;
  } else if (selectedIdx == 0) {
    const sectionStart = new Date(frameSectionStartTime[0]);
    const lastframe = frameResults[frameResults.length - 1].length - 1;
    const lastStamp =
      frameResults[frameResults.length - 1][lastframe].timestamp;
    const sectionEnd = new Date(`${date} ${lastStamp}`);

    setMinimapDomain({
      x: [sectionStart, sectionEnd],
      y: yAxisDomainRef.current
    });
    setZoomDomain({
      x: [sectionStart, sectionEnd],
      y: yAxisDomainRef.current
    });
    setLoading(true);
  } else {
    const domainStart = new Date(frameSectionStartTime[selectedIdx - 1]);
    const lastframe = frameResults[selectedIdx - 1].length - 1;
    const lastStamp = frameResults[selectedIdx - 1][lastframe].timestamp;
    const sectionEnd = new Date(`${date} ${lastStamp}`);
    const sectionLength = sectionEnd - domainStart;
    const domainLength = 15000;
    const domainEnd = new Date();
    domainEnd.setTime(domainStart.getTime() + domainLength);

    setMinimapDomain({
      x: [domainStart, sectionLength >= 15000 ? sectionEnd : domainEnd],
      y: yAxisDomainRef.current
    });
    setZoomDomain({
      x: [domainStart, domainEnd],
      y: yAxisDomainRef.current
    });
    setLoading(false);
  }
}, [selectedIdx, frameResults]);

The component part looks like this:

<Spin spinning={loading}>
  <VictoryChart>...</VictoryChart>
</Spin>

I think it may be because rendering is too quick, so the loading state cannot get back to true again. I added a timeout; with it, the icon renders every time the frameResults data changes, but with a slight delay. It should render immediately once the frameResults data changes, but now there is a slight delay before the loading icon shows.

How to run Vitest in Firefox and Chromium but gather coverage only for Chromium via V8

I have a problem with Vitest + coverage via V8. I want to run tests with Vitest in multiple browsers, but I cannot even run the tests because of the error below.

My Vitest config looks like this for the browser instances:

browser: {
  enabled: true,
  provider: 'playwright',
  instances: [
    { name: 'chromium', browser: 'chromium' },
    { name: 'firefox', browser: 'firefox' }
  ]
},

and coverage:

coverage: {
  provider: 'v8',
  reporter: ['html', 'cobertura'],
  reportsDirectory:'./coverage/chromium',

And when I run tests I see an error

Error: @vitest/coverage-v8 does not work with
{
  "browser": {
    "provider": "playwright",
    "instances": [
      {
        "browser": "chromium"
      },
      {
        "browser": "firefox"
      }
    ]
  }
}

Use either:
{
  "browser": {
    "provider": "playwright",
    "instances": [
      {
        "browser": "chromium"
      }
    ]
  }
}

...or change your coverage provider to:
{
  "coverage": {
    "provider": "istanbul"
  }
}

The main question is: how do I run tests in all browsers but gather coverage only for Chromium?

Can the return value of the `load` hook in Node.js’s module customization hooks be passed to the next hook?

The Chaining section of the Node.js documentation mentions that the registered hooks will form chains.

Based on the load hook section, I’ve tried it myself.

I found that if the load hook wants to return an object with a processed source, shortCircuit must be set to true, otherwise it will throw an error:

"./my-hook.mjs 'load'" did not call the next hook in its chain and did not explicitly signal a short circuit. If this is intentional, include `shortCircuit: true` in the hook's return.

If I call the nextLoad function, its parameters also don’t seem to allow passing the processed result of my hook; the parameter can only pass a URL, but my hook’s processed result is in memory.

Does this mean my hook cannot pass its processed result to other hooks?
Am I misunderstanding something?

My code:

// my-hook.mjs
import { readFile } from "node:fs/promises";
import { fileURLToPath } from "node:url";

export async function load(url, context, nextLoad) {
  if (url.endsWith(".js")) {
    let source = await readFile(fileURLToPath(url), "utf-8");
    source = source.replace("hello", "123");
    return {
      source: source,
      shortCircuit: true, // cannot be false
      format: "module",
    };
  }
  return nextLoad(url);
}
// register-hooks.mjs
import { register } from "node:module";
register("./my-hook.mjs", import.meta.url);

node --import ./register-hooks.mjs a.js


Context:

I want to add a new syntax to TypeScript.

My idea is to write a load hook that processes .ts files and converts my syntax into standard TypeScript syntax. Then, I would pass the converted code to tsx.

Like this: node --import my-hook --import tsx a.ts

Is this feasible? If not, could you recommend other methods?