JavaScript WebView on Android, need to be able to detect the virtual keyboard being removed (Cordova App)

I have a Cordova app, We Vote Ballot Guide, that is having trouble when the Android hide-keyboard button is pressed in Android’s navigation bar at the bottom of the screen.

When the virtual keyboard is displayed, the Android navigation bar’s back button (the rightmost icon) turns into a down arrow, which acts as a “virtual keyboard close button”.

I need a way to detect the pressing of the “Android keyboard close button”, or a way to disable it.

My app has its own bottom navigation bar that I need to hide when the virtual keyboard appears (so that there is enough room left on the screen to see the input field on the underlying page).

It is easy enough to hide my app’s nav bar when the focus goes to the input field, and restore it when the input field loses focus. The problem is that I can’t find a way to detect if the Android “virtual keyboard close button” has been clicked.

iOS has a keyboardDidHide event listener that lets me know when the virtual keyboard goes away, but Android does not seem to have an equivalent.

I tried:

  1. navigator.virtualKeyboard.addEventListener('geometrychange', () => { console.log('BUTTONTEST geometry'); });, but it was not invoked.

  2. document.addEventListener('keydown', (event) => { console.log('BUTTONTEST keydown: ', JSON.stringify(event)); });, but it did not return useful data.

  3. const hideKeyboardButton = document.getElementById('hideKeyboardButton');, but that element does not exist (at least in Android).

  4. window.addEventListener('keyboardDidHide', () => console.log('BUTTONTEST keyboardDidHide'));, which only works on iOS.
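For completeness, the next thing I plan to try is watching window.visualViewport for a sudden height increase, and treating that as the keyboard being dismissed. This is only a sketch; I'm not sure how reliable visualViewport is inside an Android WebView, and the 150px threshold is a guess on my part:

```javascript
// Decide whether a viewport height change looks like the keyboard closing.
// Pure helper so the logic can be tested outside the WebView; the 150px
// threshold is a guess, not a documented value.
function keyboardProbablyClosed(previousHeight, currentHeight, threshold = 150) {
  return currentHeight - previousHeight > threshold;
}

// Wire it up only when running in a browser with visualViewport support.
if (typeof window !== 'undefined' && window.visualViewport) {
  let lastHeight = window.visualViewport.height;
  window.visualViewport.addEventListener('resize', () => {
    const height = window.visualViewport.height;
    if (keyboardProbablyClosed(lastHeight, height)) {
      console.log('BUTTONTEST keyboard probably closed');
      // ...restore my app's bottom nav bar here
    }
    lastHeight = height;
  });
}
```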

Public token generated via a Server Action and set in cookies by a client component, then forwarded with router.refresh(), is sometimes not available to Server Actions

When using Next.js Server Actions to generate a token and then set a cookie on the client, calling router.refresh() immediately after may not always expose the cookie to subsequent server actions or server components.

'use server';

import {cookies} from 'next/headers';
import {defaultLocale} from './constants';

const baseFetch = async ({
  endPoint,
  method = 'GET',
  options = {},
  queryParams = {},
  body,
  url,
  headers = {},
}) => {
  try {
    const appCookies = await cookies();
    const publicToken = appCookies.get('token')?.value;
    const locale = appCookies.get('NEXT_LOCALE')?.value || defaultLocale;

    const queryString = new URLSearchParams(queryParams).toString();
    const fullUrl =
      url ||
      `${process.env.NEXT_PUBLIC_API_URL}${endPoint}${
        queryString ? `?${queryString}` : ''
      }`;

    const response = await fetch(fullUrl, {
      method,
      headers: {
        ...(publicToken && {Authorization: `Bearer ${publicToken}`}),
        'Content-Type': 'application/json',
        'Accept-Language': locale,
        Accept: 'application/json',
        'User-Agent': 'Next.js Server',
        ...headers,
      },
      ...(body && {body: JSON.stringify(body)}),
      cache: 'no-store',
      ...options,
    });

    const text = await response.text();
    let json;
    try {
      json = text ? JSON.parse(text) : null;
    } catch {
      json = {error: text};
    }

    if (!response.ok) {
      return {
        ...json,
        status: response?.status,
        headers: Object.fromEntries(response.headers.entries()),
        endPoint,
      };
    }

    return {
      data: json,
      headers: Object.fromEntries(response.headers.entries()),
    };
  } catch (err) {
    return {
      error: err?.message || 'Unexpected error occurred',
    };
  }
};

export default baseFetch;

'use client';

import {generatePublicToken} from '@/services';
import {setCookie} from 'cookies-next';
import {useEffect, useState} from 'react';
import useRouter from './useRouter';
import useHandleAnalytics from './useHandleAnalytics';
import {useDispatch, useSelector} from 'react-redux';
import {getPublicToken} from '@/selectors/auth';
import {setPublicToken} from '@/slices';
import {ONE_MONTH_IN_SECONDS} from '@/lib';

const useHandlePublicToken = () => {
  const publicToken = useSelector(getPublicToken);
  const [isCreatingPublicToken, setIsCreatingPublicToken] = useState(false);
  const dispatch = useDispatch();
  const [createPublicTokenError, setIsCreatePublicTokenError] = useState(null);
  const router = useRouter();
  const {logException} = useHandleAnalytics();

  const onCreatePublicToken = async () => {
    setIsCreatingPublicToken(true);
    setIsCreatePublicTokenError(null);

    const publicTokenRequest = await generatePublicToken();
    if (publicTokenRequest?.data?.access_token) {
      dispatch(setPublicToken(publicTokenRequest?.data?.access_token));
      setCookie('token', publicTokenRequest?.data?.access_token, {
        maxAge: ONE_MONTH_IN_SECONDS,
      });
      router.refresh();
    } else {
      setIsCreatePublicTokenError({
        error: publicTokenRequest?.error || publicTokenRequest?.errors?.[0],
        requestId: publicTokenRequest?.headers?.['x-request-id'],
      });
      logException({
        description:
          publicTokenRequest?.error || publicTokenRequest?.errors?.[0],
        requestId: publicTokenRequest?.headers?.['x-request-id'],
        endpoint: publicTokenRequest?.endPoint,
      });
    }

    setIsCreatingPublicToken(false);
  };

  useEffect(() => {
    if (!publicToken && !isCreatingPublicToken) {
      onCreatePublicToken();
    }
  }, [publicToken]);

  return {
    isCreatingPublicToken,
    createPublicTokenError,
    onRetry: onCreatePublicToken,
  };
};
export default useHandlePublicToken;

'use server';

import baseFetch from '@/lib/baseFetch';

export const generatePublicToken = async () => {
  const response = await baseFetch({
    endPoint: `oauth/token`,
    method: 'POST',
    body: {
      grant_type: 'client_credentials',
      client_id: process.env.CLIENT_ID,
      client_secret: process.env.CLIENT_SECRET,
    },
  });
  return response;
};


Then this hook is called:

  useEffect(() => {
    if (
      !cartId &&
      storeId &&
      publicToken &&
      !isCreatingOrder &&
      !creatingOrderError
    ) {
      handleNewOrderCreation();
    }
  }, [cartId, storeId, publicToken, isCreatingOrder]);
  const handleNewOrderCreation = async () => {
    clearOrderCookies();

    const response = await onCreateOrder();
    if (response?.error || response?.errors) {
      logException({
        description: response?.error || response?.errors?.[0],
        requestId: response?.headers?.['x-request-id'],
        endpoint: response?.endPoint,
      });
    } else {
      router.refresh();
    }
  };

  const onCreateOrder = async () => {
    const storeId = getCookie('storeId');
    const tableId = getCookie('tableId');

    setIsCreatingOrder(true);
    setCreatingOrderError(null);
    const storeResponse = await getStore(storeId);

    if (storeResponse?.error || storeResponse?.errors) {
      setCreatingOrderError({
        error: storeResponse?.error || storeResponse?.errors?.[0],
        requestId: storeResponse?.headers?.['x-request-id'],
      });
      setIsCreatingOrder(false);
      return storeResponse;
    }

    const deliveryMethodId =
      storeResponse?.data?.stores?.delivery_methods?.[0]?.id;

    const orderData = {
      store_id: storeId,
      delivery_method_id: deliveryMethodId,
      ...(tableId && {table_id: tableId}),
    };

    const response = await createCart(orderData);

    if (response?.error || response?.errors) {
      setCreatingOrderError({
        error: response?.error || response?.errors?.[0],
        requestId: response?.headers?.['x-request-id'],
      });
      logException({
        description: response?.error || response?.errors?.[0],
        requestId: response?.headers?.['x-request-id'],
        endpoint: response?.endPoint,
      });
    } else {
      setCookie('cartId', response?.data?.orders?.id, {
        maxAge: ONE_MONTH_IN_SECONDS,
      });
      setCookie('cartStoreId', response?.data?.orders?.store?.id, {
        maxAge: ONE_MONTH_IN_SECONDS,
      });
      setCookie('guest_token', response?.data?.orders?.guest_token, {
        maxAge: ONE_MONTH_IN_SECONDS,
      });
    }

    setIsCreatingOrder(false);

    return response;
  };
export const createCart = async body => {
  const response = await baseFetch({
    endPoint: `api/v1/orders/create_cart`,
    method: 'POST',
    body: {...body, source: process.env?.CLIENT_SOURCE},
  });
  return response;
};

So when baseFetch reads the token from cookies, it is sometimes undefined for random users.

How can I create a RegEx pattern that matches strings in both directions?

So, I have written a RegEx all by myself. It works quite well, although there is one problem: I cannot figure out how to make the RegEx work both ways. Currently it supports this:

€ 10
€10
EUR 10
EUR10

But I want it to be able to support this:

10€
10 €
10 EUR
10EUR

This is my current RegEx:

const regex = /(?<!\w)(?:€\s?|EUR\s?)([0-9]{1,3}(?:[.,\s][0-9]{3})*(?:[.,][0-9]+)?|\d+(?:[.,]\d+)?)(?:\s?(?:€|EUR))?(?!\w)/g;
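What I'm aiming for, I think, is a single alternation that accepts the currency marker on either side of the number. Here is a simplified sketch of that idea (it drops the thousands-separator group from my real pattern):

```javascript
// Match "€10", "€ 10", "EUR 10", "EUR10" and also "10€", "10 €", "10 EUR", "10EUR".
// Simplified: the thousands-separator handling from the original is omitted.
const both = /(?<!\w)(?:(?:€|EUR)\s?\d+(?:[.,]\d+)?|\d+(?:[.,]\d+)?\s?(?:€|EUR))(?!\w)/g;

const samples = ['€ 10', '€10', 'EUR 10', 'EUR10', '10€', '10 €', '10 EUR', '10EUR'];
// Re-create the regex without the sticky /g state so .test() is side-effect free.
const matched = samples.filter(s => new RegExp(both.source).test(s));
console.log(matched.length); // should be all eight samples
```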

How to check if you have multiple nested objects in an object in javascript? [duplicate]

I’m trying to find a way to check if there is another nested object in an object.

const data = {animal: "dog"}

// 2nd layer keys test1: Expected true; Result: false
console.log(Object.keys(data.animal) === "undefined");        

// 2nd layer keys test2: Expected undefined; Result: [ '0', '1', '2' ]
console.log(Object.keys(data.animal));                        

// 2nd layer values test1: Expected true; Result false
console.log(Object.values(data.animal) === "undefined");      

// 2nd layer values test2: Expected undefined; Result [ 'd', 'o', 'g' ]
console.log(Object.values(data.animal));                      
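For reference, what I'm effectively trying to write is a check like the following sketch (hasNestedObject is just an illustrative name, not an existing API):

```javascript
// True when at least one property value is itself a (non-null) object.
// Note: arrays also count, since typeof [] === 'object'.
function hasNestedObject(obj) {
  return Object.values(obj).some(
    value => typeof value === 'object' && value !== null
  );
}

console.log(hasNestedObject({ animal: 'dog' }));           // false
console.log(hasNestedObject({ animal: { name: 'dog' } })); // true
```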

502 Gateway Timeout when generating large Excel reports (60MB+) with ExcelJS and AWS — works for 1 month, fails for 2-3 months range

I’m working on a Node.js + Express backend where I need to generate and download Excel reports from MongoDB using ExcelJS.
When the user selects a large date range (2-3 months) — around 60 MB of Excel data — the server times out with a 502 Gateway Timeout on AWS.

When I select a 1-month range, it works fine.

What I tried:

Initially, my code generated a single Excel file for the entire range:

downloadReportsExcel: async (req, res) => {
  try {
    req.setTimeout(0);
    const { from, to } = req.query;
    const fromDate = new Date(from);
    const toDate = new Date(to);

    const reports = await Report.find({
      createdAt: { $gte: fromDate, $lte: toDate },
    }).populate("case reportedBy");

    const workbook = new ExcelJS.stream.xlsx.WorkbookWriter({ stream: res });
    const worksheet = workbook.addWorksheet("Reports");

    worksheet.columns = [
      { header: "CONTRACTOR NAME", key: "contractorName", width: 25 },
      // ...other columns
    ];

    res.setHeader(
      "Content-Disposition",
      `attachment; filename="Reports_${from}_to_${to}.xlsx"`
    );

    for (const report of reports) {
      worksheet.addRow({
        "CONTRACTOR NAME": report.contractorName || "N/A",
        // ...
      }).commit();
    }
    worksheet.commit();
    await workbook.commit();
  } catch (err) {
    console.error(err);
  }
},

✅ This worked for smaller date ranges (1 month),
❌ But failed for larger ranges (2–3 months, ~60MB file) with 502 Gateway Timeout after 2–3 minutes (AWS default limit).

Attempted fix (split into monthly chunks and zip)

To fix it, I tried splitting the range into monthly chunks, generating separate Excel files for each month, and then zipping them together:

const chunks = getMonthlyChunks(fromDate, toDate);
const zip = new JSZip();

for (const chunk of chunks) {
  const reports = await Report.find({
    createdAt: { $gte: chunk.start, $lte: chunk.end },
  });

  const workbook = new ExcelJS.Workbook();
  const worksheet = workbook.addWorksheet("Reports");

  worksheet.columns = [...];

  for (const report of reports) {
    worksheet.addRow({...});
  }

  const buffer = await workbook.xlsx.writeBuffer();
  const chunkName = `Reports_${chunk.start.toISOString()}_${chunk.end.toISOString()}.xlsx`;
  zip.file(chunkName, buffer);
}

const zipBuffer = await zip.generateAsync({ type: "nodebuffer" });
res.setHeader("Content-Type", "application/zip");
res.setHeader("Content-Disposition", `attachment; filename="Reports.zip"`);
res.send(zipBuffer);

✅ Works locally for 1-month data,
❌ Still times out for 2–3 months on AWS (file ~60 MB, 2–3 min processing).
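For reference, getMonthlyChunks is a small date helper along these lines (a sketch of what mine does, not the exact code):

```javascript
// Split [fromDate, toDate] into consecutive calendar-month ranges
// that don't overlap: each chunk ends 1 ms before the next month starts.
function getMonthlyChunks(fromDate, toDate) {
  const chunks = [];
  let start = new Date(fromDate);
  while (start < toDate) {
    // First instant of the next calendar month.
    const nextMonth = new Date(start.getFullYear(), start.getMonth() + 1, 1);
    const end =
      nextMonth < toDate ? new Date(nextMonth.getTime() - 1) : new Date(toDate);
    chunks.push({ start: new Date(start), end });
    start = nextMonth;
  }
  return chunks;
}
```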

How do I load multiple folders in one Astro content collection?

I want to make a blog page where it queries all the blogs I have.
Here’s the current file tree of what I’m thinking of doing. The reason I want to have multiple folders is so I can put these posts in a more organized way.

src
├── blog
│   ├── config.ts
│   ├── dist
│   ├── images
│   └── posts
│       ├── 2024
│       │   └── huntress-ctf.mdx
│       └── 2025
│           └── netcomp-ctf.mdx

But when I try to import the collection, it returns this:

The collection "posts" does not exist or is empty. Please check your content config file for errors.
[]

Here’s what my config currently looks like:

import { z, defineCollection } from "astro:content";
import { glob } from "astro/loaders";

const posts = defineCollection({
  loader: glob({ pattern: "{2024,2025}/**/[^_]*.md", base: "./src/blog/posts" }),
  schema: z.object({
    title: z.string(),
    pubDate: z.date(),
    description: z.string(),
    tags: z.array(z.string()),
  }),
});

export const collections = { posts };

And I tried loading my collection like this:

---
import { getCollection } from "astro:content";
import Layout from '../layouts/Layout.astro';

const posts = await getCollection("posts");
console.log(posts);
---

I wonder what I’m doing wrong here. I want to find a workable solution for:

  • Keeping my posts organized in folders
  • Importing my post collection through one collection
  • Side note: I also want my posts to be able to refer to dist (for file downloads) and images (for optimised images embedded with ![alt]() rather than something like <Image>), so I also need to know how to reference them properly in my blog posts through imports/collections, because I’m still confused about how to approach this.

How do I preserve the colors of an image when I lower its overall opacity to almost transparent? (html javascript) [duplicate]

When I convert an image to almost transparent (1% opacity), either by using imageData or globalAlpha, the original color data is lost in the conversion. When I open the saved image in any image-editing software (I checked, and the editing software is not the problem here) and restore it to full opacity there, I notice that only 16 or 24 colors remain. The image looks as if it had been posterized.

For example, if my image had a color “#90F59A”, after the conversion, the color would now become “#AAFFAA”.

const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
const data = imageData.data;

ctx2.clearRect(0, 0, canvas2.width, canvas2.height);

for (let i = 0; i < data.length; i += 4) {
    const red = data[i];
    const green = data[i + 1];
    const blue = data[i + 2];
    const alpha = data[i + 3];
    if (alpha == 0 || (red == 0 && green == 0 && blue == 0)) {
        // black pixels become fully transparent
        data[i] = 0;     // Red
        data[i + 1] = 0; // Green
        data[i + 2] = 0; // Blue
        data[i + 3] = 0; // Alpha
    } else {
        // all other pixels turn almost transparent
        data[i + 3] = 255 * 0.01;
    }
}
ctx2.putImageData(imageData, 0, 0);

I tried using globalAlpha and found that the quality was reduced. I then tried using imageData because I thought it would preserve the image data since I was modifying the image’s data itself (or the array or color values the image contains), but it also didn’t work.

I also tested if the color quality loss would happen with images that are more opaque (like 80%), and saw that the image quality did not get lower. I also tested with an opacity that is just a bit higher than 1% (like 3%) and saw that there was less reduction in color quality, so I assume the color quality loss was because of some computation with transparent pixels.

I was expecting the image to become almost transparent (1% opacity) but still have the colors be the same.
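To sanity-check my assumption, I modeled what I suspect the browser is doing: storing canvas pixels premultiplied by alpha and rounding each channel to a byte. This is my guess at the mechanism, not something I've confirmed from a spec, but it reproduces the exact color shift from my example:

```javascript
// Simulate a channel value going through premultiplied-alpha byte storage:
// multiply by alpha, round to a byte, then divide back out on read-back.
function roundTrip(channel, alpha) {
  const stored = Math.round((channel * alpha) / 255);        // premultiplied byte
  return Math.min(255, Math.round((stored * 255) / alpha));  // un-premultiplied
}

// 1% opacity stores as alpha ≈ 3, leaving only a handful of distinct levels.
const alpha = 3;
console.log(roundTrip(0x90, alpha).toString(16)); // "aa"
console.log(roundTrip(0xf5, alpha).toString(16)); // "ff"
console.log(roundTrip(0x9a, alpha).toString(16)); // "aa"
// Matches the observed "#90F59A" -> "#AAFFAA" shift.
```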

storybook error: “Cannot read properties of undefined (reading ‘tap’)”

I'm trying to run Storybook (v8.6.14) in a Next.js 14 project with @storybook/nextjs, but it keeps breaking on startup with this error:

TypeError: Cannot read properties of undefined (reading 'tap')

stack trace points to:
node_modules/next/dist/compiled/webpack/bundle5.js

It looks like Storybook is somehow using Next's internal webpack instead of its own.

already tried:

  • removed all addons (even @storybook/addon-styling-webpack)
  • cleared the cache and reinstalled everything
  • tried forcing Storybook to use its own webpack via @storybook/builder-webpack5 and aliasing webpack
  • made sure everything's on webpack 5.101

still the same thing: it crashes the moment it starts

the weird thing is I had it working a few times on another local clone of the same repo, but after working on it for a while it started happening there too. it feels like a cache problem, but I really don't know why none of the things I've done fixed it

I'd appreciate your help 🙂

main.ts:

import type { StorybookConfig } from '@storybook/nextjs'

const config: StorybookConfig = {
  stories: [
    '../ui/**/*.stories.@(js|jsx|mjs|ts|tsx)',
  ],
  addons: [],
  framework: {
    name: '@storybook/nextjs',
    options: {},
  },
  staticDirs: ['../public'],
  docs: {
    autodocs: 'tag',
  },
}
export default config

Is it possible to use the “/” route for a controller in Laravel with Inertia?

What I want to set up in routes/web.php is something like:

Route::resource('/main', CallController::class);

So, if a user goes to www.sitename.com/, the index of CallController should be executed and displayed. However, if I try to compile that with npx vite build, I am invariably met with an error of this type:

Expected identifier but found ":"
216 |   * @route '/{}'
217 |   */
218 |  export const show = (args: { : string | number } | [param: string | number ] | string | number, options?: RouteQueryOptions): RouteDefinition<'get'> => ({
    |                               ^
219 |      url: show.url(args, options),
220 |      method: 'get',

The error only seems to go away if some path other than / is assigned as the path for the CallController’s index.

The only workaround I have found is to set the resource route to another path and redirect to it from /:

Route::get('/', function () {
    return redirect('/main');
})->name('home');

But I wonder whether this is the intended behaviour, and whether it is possible to use / as a route for a resource controller?

Why are my Laravel validation errors not showing in the Blade view? [closed]

I made a form in Laravel and added validation rules in the store() method.
The validation works because when I submit invalid data, the page refreshes and the form doesn’t submit — but I don’t see any error messages on the page.

Here’s part of my form in Blade:

<form action="{{ route('register') }}" method="POST">
    @csrf
    <input type="text" name="name" placeholder="Name">
    <button type="submit">Register</button>
</form>

And the validation inside my request file:

public function rules()
{
    return [
        'name' => 'required|min:3',
    ];
}

I tried using @error('name') inside my form, but it still doesn’t display anything.
I expected to see the validation message like “The name field is required.” but the form just reloads.
How can I properly show Laravel validation errors in a Blade view?

How can I display dynamic menu items like the 7 Brew Menu using JavaScript?

I’m trying to build a dynamic coffee shop menu for a project, kind of like the 7 Brew Menu, where each category (like drinks, flavors, add-ons, etc.) updates based on user selection.

I’m using HTML, CSS, and vanilla JavaScript, but I’m not sure how to structure the data so it’s easy to update and display.

For example:

When a user clicks on “Iced Drinks,” it should show only iced drink items.

Each item should have a name, price, and maybe an image.

What’s the best way to store and render this kind of data — should I use an array of objects or fetch it from a JSON file?

Any advice or code examples would be super helpful!

What I tried:
I tried creating an array of menu items in JavaScript and displaying them using a simple forEach loop. It works, but all items show at once instead of updating when I select a category (like “Iced Drinks” or “Blended Drinks”).

What I expected:
I expected only the selected category items to display — similar to how the 7 Brew Menu updates depending on drink type.

What actually happened:
All the menu items appeared together on the page, and the filter buttons didn’t change the displayed results.
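For reference, here is a trimmed-down sketch of the structure I'm working with: an array of objects plus a pure filter step, separate from the rendering. The items and the 'menu' container id are made up for illustration:

```javascript
// Menu data as an array of objects (made-up items for illustration).
const menuItems = [
  { name: 'Iced Caramel', price: 4.5, category: 'iced' },
  { name: 'Blended Mocha', price: 5.0, category: 'blended' },
  { name: 'Iced Vanilla', price: 4.0, category: 'iced' },
];

// Pure filtering step, kept separate from rendering so it can be tested.
function itemsForCategory(items, category) {
  return items.filter(item => item.category === category);
}

// Rendering only runs in a browser; 'menu' is an assumed container id.
if (typeof document !== 'undefined') {
  const list = document.getElementById('menu');
  const render = category => {
    list.innerHTML = '';
    for (const item of itemsForCategory(menuItems, category)) {
      const li = document.createElement('li');
      li.textContent = `${item.name} – $${item.price.toFixed(2)}`;
      list.appendChild(li);
    }
  };
  render('iced'); // e.g. called from a category button's click handler
}
```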

Why doesn’t my v-model show the initial value in Vue 3?

I’m building a simple component in Vue 3 that uses v-model to bind an input field.
When I set the variable value inside mounted(), it updates correctly in the paragraph but the input itself stays empty.
Here’s my component:

<template>
  <div>
    <input type="text" v-model="name">
    <p>{{ name }}</p>
  </div>
</template>

<script>
export default {
  data() {
    return { name: '' }
  },
  mounted() {
    this.name = 'John'
  }
}
</script>

I tried setting the initial value directly in the data() function and it worked, but I need to assign it dynamically later.
I also checked the browser console and saw no errors.
I expected the input to show “John” automatically when the component loads, but it only updates when I type something manually.
Why doesn’t the input reflect the data value on mount?

How do I change the Google PlaceAutocompleteElement width?

Unlike its predecessor, Google’s new PlaceAutocompleteElement imposes its own styling on the input, and it clashes with my site.

In particular, the width is not full-width / fill. I see Google allows a small handful of overrides. However, is there any way to ensure the autocomplete fills its container?

In one example, the autocomplete is appended to the body and it does fill the width. In another example, the autocomplete appears to be constrained. But neither example explains how the width is controlled.

I did try the following and it does not work:

const autocompleteElement = new window.google.maps.places.PlaceAutocompleteElement()
autocompleteElement.style.colorScheme = "light"
autocompleteElement.style.width="100%"
autocompleteElement.width = "100%"
myElement.appendChild(autocompleteElement)