javascript getUTCDay() doesn't get the day [closed]

Has anyone had, or is anyone having, the same issue? How can I fix it? I need to split the date and grab the day, the month and the year. I tried getDay() as well and it did not work either.

function splitDate() {
  //test to get the year of today
  let today = new Date();

  Logger.log(`today: ${today}`);
  let year = today.getUTCFullYear();
  Logger.log(`year: ${year}`);
  let month = today.getUTCMonth();
  Logger.log(`month: ${month}`);  
  let day = today.getUTCDay();
  Logger.log(`day: ${day}`);

  Logger.log(`day: ${day} - month: ${month} - year: ${year}`);

}


Weird problem with forms / radio buttons and arrow navigation using Angular and Template Driven Forms

I have a weird problem with one of my forms. I am using Template Driven Forms in Angular, and as soon as I bind the selected value of the radio buttons in a form via ngModel, navigation between the radio buttons and radio button groups via the arrow keys and the Tab key stops working.

Without the ngModel attribute I can jump from one radio button group to another using Tab, and within a radio button group I can use the arrow keys to select a specific option. I also cannot leave a radio button group using the arrow keys. That is all exactly as it's supposed to be.

But when I add ngModel, I suddenly can't use the Tab key anymore to jump between radio button groups, and the arrow up and down keys are no longer trapped within one specific radio button group (using Chrome; Firefox's behaviour also changes, but in a different, equally wrong way).

You can check it for yourself using the following StackBlitz projects:
Without ngModel:
https://stackblitz.com/edit/stackblitz-starters-bwkfamcz?file=src%2Fapp%2Fform%2Fform.html

With ngModel:
https://stackblitz.com/edit/stackblitz-starters-fnptjpbf?file=src%2Fapp%2Fform%2Fform.html
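
For reference, the relevant part of the template looks roughly like this (simplified from the StackBlitz; the component, control names and values are illustrative, not the exact code):

import { Component } from '@angular/core';
import { FormsModule } from '@angular/forms';

@Component({
  selector: 'app-form',
  standalone: true,
  imports: [FormsModule],
  template: `
    <form>
      <!-- Group 1: arrow keys / Tab behave correctly until [(ngModel)] is added -->
      <label><input type="radio" name="size" value="small" [(ngModel)]="size" /> Small</label>
      <label><input type="radio" name="size" value="large" [(ngModel)]="size" /> Large</label>

      <!-- Group 2 -->
      <label><input type="radio" name="color" value="red" [(ngModel)]="color" /> Red</label>
      <label><input type="radio" name="color" value="blue" [(ngModel)]="color" /> Blue</label>
    </form>
  `,
})
export class FormComponent {
  size = 'small';
  color = 'red';
}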

Does anyone have any idea what is going on here?

Thanks in advance.

Refactoring a quasi legacy JavaScript codebase in TypeScript and dealing with data types [closed]

The project I am working on is a patchwork of at least 3 different ECMAScript versions, and I have a question about application types.

In a folder named types there are files that define data structures. One of them contains:

const MATERIAL_TYPES = {
    TOOL: 'TOOL',
    GLASS: 'GLASS',
    GEAR: 'GEAR',
    BRICK: 'BRICK',
    SLAB: 'SLAB',
    CEMENT: 'CEMENT',
};

This works. Whenever a material needs to be referenced, the code imports this object and uses MATERIAL_TYPES.GLASS.

What I have so far is a @types folder at the root of the project, with a tree of folders that loosely resembles the application's:

root
├─── @types
│    ├─── player
│    ├─── items
│    └─── services
└─── src

Each folder has an index.ts file that exports the folder's content, and @types has an index.ts file as well, so anywhere in the code I just import type { SomeType } from "../@types".

For the material type above, I removed the object and instead created a type:

type MaterialType =
    | 'TOOL'
    | 'GLASS'
    | 'GEAR'
    | 'BRICK'
    | 'SLAB'
    | 'CEMENT';

This way I “hardcode” the value:

function dealWithMaterial(material: MaterialType): string {
    // ...
}

// using the function
const result = dealWithMaterial("GLASS");
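
For comparison, a pattern I have seen suggested for this kind of migration keeps the original object and derives the union from it; this is just a sketch for discussion, not something I have committed to:

const MATERIAL_TYPES = {
    TOOL: 'TOOL',
    GLASS: 'GLASS',
    GEAR: 'GEAR',
    BRICK: 'BRICK',
    SLAB: 'SLAB',
    CEMENT: 'CEMENT',
} as const;

// Union of the object's values: 'TOOL' | 'GLASS' | 'GEAR' | 'BRICK' | 'SLAB' | 'CEMENT'
type MaterialType = (typeof MATERIAL_TYPES)[keyof typeof MATERIAL_TYPES];

function dealWithMaterial(material: MaterialType): string {
    return `handling ${material}`;
}

dealWithMaterial(MATERIAL_TYPES.GLASS); // existing call sites keep working
dealWithMaterial('GLASS');              // plain string literals also type-check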

My questions are:

  1. Is this the right way to do it?
  2. If you have done this before and encountered this scenario, what did you do?
  3. Would you do it differently now? If so, how?

I know it is open ended, but I would like some thoughts from people who have brought old JS code back to life with TypeScript.

Will new object assignment to same GLOBAL variable cause memory leaks?

I'm trying to set a GLOBAL variable that is available across all modules of a Node.js app.

module.exports = app => {
  app.get('/config', async (req, res) => {
     .....
     const { data } = await axios.get('.....')
     app.locals.MY_CONFIG = data.config
  })
}

Will this line cause a memory leak every time the route is called?

app.locals.MY_CONFIG = [data.config]

In other words, will this cause an issue:

app.locals.MY_CONFIG = [data.config]
app.locals.MY_CONFIG = [data.config]
app.locals.MY_CONFIG = [data.config]
app.locals.MY_CONFIG = [data.config]
...
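
For context, this is roughly how the value gets consumed from other modules (the route name here is just illustrative):

module.exports = app => {
  app.get('/somewhere-else', (req, res) => {
    // Reads whatever object app.locals.MY_CONFIG currently points to
    res.json(req.app.locals.MY_CONFIG);
  });
};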

I can't use import in NW.js

I use NW.js to develop desktop applications, but I can't use "import".
For example:

import { parseMetar } from "metar-taf-parser";

const metar = parseMetar(rawMetarString);

// -or-

// Optionally pass the date issued to add it to the report
const datedMetar = parseMetar(rawMetarString, { issued });

This is the library I'm using: metar-taf-parser
And I get this error:

Cannot use import statement outside a module

Why is sap.m.DatePicker rendering the calendar incorrectly?

I'm using SAPUI5 version 1.120.15 on an S4 gateway system. I am using SAP Web IDE to run this locally, and the problem occurs there too.

The DatePicker control is acting very strangely in a handful of apps. I have no idea why it is acting this way; does anyone know?

It looks like the week-number column on the left side is being used for the dates instead, making it look like there are 8 days in a week.
I tried putting a bare <DatePicker /> control elsewhere in the view and it still does this. I tried it in a separate test app and there it worked correctly.

Is there something about the app configuration that would cause this weird bug?

Thanks very much


Calling a Javascript async function with async/await when streaming JSON and parsing (Node.js)

I am trying to parse an enormous JSON stream in JavaScript using Node.js, and the results are then stored in MariaDB. As part of that, I have a table that will have a small number of rows (~50), but each row has a unique value. That way, if I run across the value while parsing, I don't need to add it to the DB if it's already there.

The problem is that those 50 values will be used literally millions of times, so I don't really want to check the DB before every insert; if I can do this in one query I'd be happier.

In order to do this, and since we're only talking about 50 values, I use a Set. When I come across a value, I first check whether it's in the Set; if not, I add it to the DB and then to the Set.

The problem is that occasionally the function executes a second time with the same value before the Set has had its item added, so I end up with an attempt to insert a duplicate row. I've tried async/await, but I suspect that the whole thing is wrapped up in an async method that's being called from the top level of the file, so at some point the async/await chain breaks and what should run synchronously no longer does.
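
To illustrate the interleaving I believe is happening, here is a simplified sketch (insertIntoDb is just a stand-in for the real mariadb insert, not my actual code):

// Stand-in for the real DB insert; the point is only that it is asynchronous.
const insertIntoDb = (value) => new Promise((resolve) => setTimeout(resolve, 10));

const seen = new Set();

async function recordType(subType) {
    if (!seen.has(subType)) {
        await insertIntoDb(subType); // async, takes a few ms
        seen.add(subType);           // only added after the insert resolves
    }
}

// Two 'data' events arrive back to back with the same value:
recordType('Other');
recordType('Other');
// The second call checks the Set before the first call's insert has resolved,
// so both attempt the INSERT and the second one hits the unique constraint.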

Here’s the code I use to insert (DB stuff done via the mariadb connector):

DB file helpers

export const pool = mariadb.createPool({
    host: 'localhost',
    port: '3306',
    user: '*********',
    password: '**********'
});

export async function getBodyType(body, conn, map) {
    let subType = body.subType || "Other";

    if (subType && !map.has(subType)) {
        // Insert into the DB as we go, adding it to a set to ensure we don't duplicate
        db.insertBodyType(subType, conn)
            .then(result => {
                map.add(subType);
            });
    }

    return subType;
}

export async function insertBodyType(bodyType, conn) {
    try {
        await conn.query(queries.bodyTypesInsert, bodyType);
    } catch (err) {
        console.log(err);
    }
}

Before continuing, I’ve also tried the insert/add block as follows, which didn’t work either:

await db.insertBodyType(subType, conn);
map.add(subType);

Now here’s the way I call the above function, which is not itself in a function (it’s just the main body of the script):

let types = new Set();
const stream = fs.createReadStream(inputFile, 'utf8');
const parser = JSONStream.parse('*');
stream.pipe(parser)
    // I'm wondering if this is the culprit -- I don't know how to make the call to 
    // stream.pipe async or even if I can, so I don't know if making its body async even 
    // matters.
     .on('data', async (system) => {

        // Array of values we need for the system insert
        let systemData = [
            system.name, 
            system.coords.x, 
            system.coords.y, 
            system.coords.z, 
            helpers.isColonizable(system), 
            helpers.isPopulated(system), 
            helpers.canColonizeFrom(system)
        ];

        let bodyMap = new Map();

        // Somehow this sometimes calls twice before the first one finishes, despite the
        // fact that the whole chain from getBodyType() on up should be async
        for (let body of system.bodies) {
            let bodyType = await helpers.getBodyType(body, conn, types);
            if (!bodyMap.has(bodyType)) {
                bodyMap.set(bodyType, 0);
            }

            bodyMap.set(bodyType, bodyMap.get(bodyType) + 1);
        }
})
.on('end', () => {
    db.pool.end();
})
.on('error', (err) => {
    db.pool.end();
});

This whole thing is run in node:

$ node do_this.js

Honestly, from what I’m reading on async/await here, I’m wondering if I’m even using the right tool for the job. I might be better off using something like C# that can handle true synchronicity with async methods, assuming I read things right.

SAP Fiori UI5 / I want to attach a custom Filter to every batch request assigned to a List Report (Analytical Table)

I am currently working on a standard Fiori Elements app. I use a List Report with an Analytical Table. My requirement is quite straightforward: I use a custom data provider in the backend that works perfectly with OData V4 and is also able to handle some custom functions via filter fields.

Since I don't want the filter to be part of the page's FilterBar, and it should basically not be visible to the user, I want to attach it directly to the data binding of the table. The goal is that every batch request related to the table automatically contains that custom filter.

Many Thanks in Advance!

Handle “onclick” button event in Maui Webview with HybridWebView


I am writing a mobile application that works with a specific website. There are many buttons on this website and I need to handle the "onclick" events of these buttons.
To do this, I tried to inject an event listener into the HTML after the site had been navigated. I want to send a message to my application from the "onclick" event:

await webView.EvaluateJavaScriptAsync(
    "window.addEventListener('HybridWebViewMessageReceived', function(e) {});" +
    "var btns=document.querySelectorAll('button,input[type=button],input[type=submit]');" +
    "for(var i=0;i<btns.length;i++){btns[i].addEventListener('click',function(e){ window.HybridWebView.SendRawMessage('Hello from JS!!!');});}");

I added a HybridWebView to the XAML:

<ContentPage xmlns="http://schemas.microsoft.com/dotnet/2021/maui"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             x:Class="EczasistanMaui.Pages.MedulaPage">
    
    <HybridWebView
    x:Name="hwv"
    DefaultFile="{Binding DefaultUrl}"
    RawMessageReceived="OnRawMessageReceived" />

</ContentPage>

But I can’t get any alert from here.

private async void OnRawMessageReceived(object sender, HybridWebViewRawMessageReceivedEventArgs e)
{
    // Event argument holds the message
    await DisplayAlert("Raw Message Received", e.Message, "OK");
}

Where am I going wrong?

Why does parent folder disappear from Azure Blob Storage when all child blobs are deleted?

I’m using Azure Blob Storage, and I understand that folders are simulated using blob names with / delimiters — there’s no real folder hierarchy.
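
For context, by "listings" I mean hierarchical listing along these lines, sketched here with the JavaScript SDK (the container name and connection string are placeholders; my actual tooling may differ):

import { BlobServiceClient } from "@azure/storage-blob";

const service = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const container = service.getContainerClient("my-container");

for await (const item of container.listBlobsByHierarchy("/")) {
  if (item.kind === "prefix") {
    console.log(`virtual folder: ${item.name}`); // e.g. "aman/" - derived purely from blob name prefixes
  } else {
    console.log(`blob: ${item.name}`);           // e.g. "aman/verma/" (zero-byte) or "aman/verma/file.txt"
  }
}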

Scenario:
I have a virtual folder structure like:
aman/verma/file.txt

I also have an explicitly created blob:
aman/verma/ (zero-byte blob to simulate folder)

After deleting all blobs under the prefix aman/verma/ — including the zero-byte blob aman/verma/ and file.txt — I noticed:

✅ The virtual folder aman/verma/ disappears — expected.
❌ The parent folder aman/ also disappears from listings if it had no other content, even though I never explicitly deleted aman/.

My question:
Why does the aman/ (parent folder) disappear from listings after deleting all blobs under aman/verma/, even if I never deleted aman/?

Is Azure automatically excluding aman/ from listings if it’s empty?

Does Azure treat zero-byte blobs like aman/ as ignorable unless they prefix other blobs?

How can I retain visibility of parent folders like aman/ even if all their subfolders/files are deleted?

How to reinitialize MailerLite embedded form on route change without page refresh?

I’m working on a React project where I have a common component used across multiple static pages (e.g., About Us, FAQs, Terms & Conditions, etc.).

Inside the footer, I include a MailerLiteEmbed component to show a newsletter subscription form using MailerLite’s embedded script.

Here’s the component:

import React, { useEffect } from 'react';
 
const MailerLiteEmbed = () => {
  useEffect(() => {
    // Inject MailerLite script
    (function (w, d, e, u, f, l, n) {
      w[f] = w[f] || function () {
        (w[f].q = w[f].q || []).push(arguments);
      };
      l = d.createElement(e);
      l.async = 1;
      l.src = u;
      n = d.getElementsByTagName(e)[0];
      n.parentNode.insertBefore(l, n);
    })(window, document, 'script', 'https://assets.mailerlite.com/js/universal.js', 'ml');
 
    // Initialize with account ID
    window.ml('account', 'YOUR_ACCOUNT_ID');
  }, []);
 
  return (
    <div className="ml-embedded" data-form="YOUR_FORM_ID"></div>
  );
};
 
export default MailerLiteEmbed;

Problem:

The form loads correctly only on the first page load or after a hard refresh. When navigating between routes using React Router, the component stays mounted and doesn’t reinitialize the MailerLite form.

What I’ve Tried:

  • Passing a unique key to the MailerLiteEmbed component on each route change (see the sketch after this list).

  • Manually removing the script tag and reinserting it (didn't help).

  • Clearing the .ml-embedded container content manually.
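
The unique-key attempt looked roughly like this (the Footer component shown here is just illustrative):

import { useLocation } from 'react-router-dom';
import MailerLiteEmbed from './MailerLiteEmbed';

const Footer = () => {
  const location = useLocation();
  // A new key should force React to unmount and remount the embed on each route change
  return <MailerLiteEmbed key={location.pathname} />;
};

export default Footer;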

Question:
How can I force the MailerLite script to reinitialize and render the embedded form on every route change in a React app without refreshing the page?

javascript async fetch response wait – blob – recursive download – videos fine images not fine

async function downloadFileAsync(url, filename, ext) {
    try {
            // Fetch the file as a Blob
            const response = await fetch(url);
            if (!response.ok) {
                throw new Error(`HTTP error! status: ${response.status}`);
            }
            //console.log("downloadFileAsync:ASYNC response OK for:" + filename + ext);
            const blob = await response.blob();

            // Create a temporary URL for the Blob
            //const fileURL = URL.createObjectURL(blob)
            const _type = getTypeByEXT(ext);
            const fileURL = URL.createObjectURL(new Blob([blob], { type: _type }));

            // Create a link element to trigger the download
            const link = document.createElement('a');
            link.href = fileURL;
            link.download = filename + ext; // Suggests a filename for the download
            link.style.display = 'none'; // Hide the link

            // Programmatically click the link to initiate download
            document.body.appendChild(link);
            link.click();

            // Remove the link from the body
            document.body.removeChild(link); // Clean up the link element

            // Revoke the object URL to free up memory
            URL.revokeObjectURL(fileURL);

            console.log(`${filename}${ext} downloaded successfully.`);
     } catch (error) {
            console.error(`Error downloading ${filename}${ext}:`, error);
     }
}

Line 14 from the snippet above:

  1. const fileURL = URL.createObjectURL(blob);

  2. const fileURL = URL.createObjectURL(new Blob([blob], { type: ext }));

Both work fine, if I am downloading a video.

Images never download.

That is why I used the MIME type 'image/jpg' as the type, together with new Blob(…).

I used 1) first; it always worked for videos, but not for images.

I did some research and found that I may have to use 2) instead.

Still no luck.

As an aside, I also add a delay between downloads, which is how I get any downloads at all.

setTimeout(() => {
    // Recursively call the function with the rest of the array (excluding the first element)
    processDownloadArrayWithDelay(arr.slice(1), delay);
}, delay);

and also with

downloadFileAsync(url, fn, ext);

The reason why I use recursion and setTimeout is that Promises weren't working; they were left unfulfilled. Like I said, videos download fine, images never do…
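
Putting those pieces together, the download loop has roughly this shape (a reconstruction from the fragments above; the exact shape of the array entries is an assumption):

function processDownloadArrayWithDelay(arr, delay) {
    if (arr.length === 0) return;

    // Assuming each entry carries the url, filename and extension
    const { url, fn, ext } = arr[0];
    downloadFileAsync(url, fn, ext);

    setTimeout(() => {
        // Recursively call the function with the rest of the array (excluding the first element)
        processDownloadArrayWithDelay(arr.slice(1), delay);
    }, delay);
}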

pc.oniceconnectionstatechange = () => { state disconnected

I am making a streaming app in React Native. When the host starts streaming and a user joins the stream, the user can see the host's stream. But when that user starts their own stream, the host is not able to see the user's stream, and at that point the ICE connection state shows "disconnected".

Here is the function:

const connectToStreamer = async (streamerId) => {
  if (streamerId === socket.id || peerConnections.current[streamerId]) {
    console.log(`Skipping connection to ${streamerId}: Already connected or self`);
    return;
  }

try {
  const pc = new RTCPeerConnection(iceServers);
  peerConnections.current[streamerId] = pc;

  // Decide direction based on whether you're sending a stream
  const isSendingStream = !!localStreamRef.current;

  if (isSendingStream) {
    localStreamRef.current.getTracks().forEach(track => {
      pc.addTrack(track, localStreamRef.current); // send
    });
  } else {
    pc.addTransceiver('video', { direction: 'recvonly' }); // receive
    pc.addTransceiver('audio', { direction: 'recvonly' }); // receive
  }

  pc.ontrack = event => {
    if (event.streams[0]) {
      console.log(`Received stream from ${streamerId}:`, event.streams[0]);
      setRemoteStreams(prev => new Map(prev).set(streamerId, event.streams[0]));
    }
  };

  pc.onicecandidate = event => {
    if (event.candidate) {
      console.log(`Sending ICE candidate to ${streamerId}`);
      socket.emit('ice-candidate', { target: streamerId, candidate: event.candidate });
    }
  };

  pc.oniceconnectionstatechange = () => {
    socket.emit('Errorlogs','pc.oniceconnectionstatechange',`ICE connection state changed for ${streamerId}: ${pc.iceConnectionState}`);
    console.log(`ICE state for ${streamerId}: ${pc.iceConnectionState}`);
    if (pc.iceConnectionState === 'failed' || pc.iceConnectionState === 'disconnected') {
      setTimeout(() => {
        if (pc.iceConnectionState !== 'connected' && peerConnections.current[streamerId]) {
          console.log(`Retrying connection to ${streamerId}`);
          connectToStreamer(streamerId);
        }
      }, 5000);
    } else if (pc.iceConnectionState === 'closed') {
      delete peerConnections.current[streamerId];
      setRemoteStreams(prev => {
        const newStreams = new Map(prev);
        newStreams.delete(streamerId);
        return newStreams;
      });
    }
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  console.log(`Sending offer to ${streamerId}`);
  socket.emit('offer', { target: streamerId, sdp: offer });
} catch (err) {
  socket.emit('Errorlogs',err)
  console.error(`Error connecting to streamer ${streamerId}:`, err);
  delete peerConnections.current[streamerId];
}

};

I want the host to see all joined users' streams, and all users to see each other's streams in that room.
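
For reference, iceServers is shaped roughly like this (the actual STUN/TURN servers and credentials are redacted, so treat these as placeholder values, not my real configuration):

const iceServers = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    // { urls: 'turn:turn.example.com:3478', username: '...', credential: '...' },
  ],
};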

How to get eventFilters to trigger on a Firestore onUpdate in Firebase Functions v2 when nested value is null or undefined

I'm trying to trigger a Firebase Cloud Function (v2) only when a specific nested field is added to a Firestore document. That means the value for that specific field should be undefined in event.data.before and defined in event.data.after. According to multiple GPTs, eventFilters should allow this, but it's not firing as expected.

Let’s pretend my firestore document’s shape is as follows:

{
  foo?: {
     bar?: string;
  }
}

Here is my Firebase Function code, which I wish would trigger if and only if bar is undefined (or foo is undefined, and thus bar is too) before, and bar is defined afterwards.
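
To make the condition concrete, these are the document transitions I care about (values are made up):

// Should trigger:
//   {}                    ->  { foo: { bar: 'x' } }
//   { foo: {} }           ->  { foo: { bar: 'x' } }
// Should NOT trigger:
//   { foo: { bar: 'x' } } ->  { foo: { bar: 'y' } }   (bar already existed)
const before = { foo: {} };
const after = { foo: { bar: 'x' } };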

Here are my attempts:

// Attempt 1
export const onBarAdded = onDocumentUpdated(
  {
    document: 'users/{uid}',
    eventFilters: {
      'oldValue.fields.foo.bar': 'null',
      'value.fields.foo.bar.stringValue': '*'
    },
  },
  // ... handler omitted
);

// Attempt 2
export const onBarAdded = onDocumentUpdated(
  {
    document: "users/{uid}",
    eventFilters: {
      "data.before.foo.bar": "== null",
      "data.after.foo.bar": "!= null",
    },
  },
  // ... handler omitted
);

I can’t find good documentation either. I find the online documentation sparse and the documentation in the code is a TODO:

// firebase-functions/lib/v2/options.d.ts
// ...
/**
 * Additional fields that can be set on any event-handling function.
 */
export interface EventHandlerOptions extends Omit<GlobalOptions, "enforceAppCheck"> {
    /** Type of the event. Valid values are TODO */
    eventType?: string;
    /** TODO */
    eventFilters?: Record<string, string | Expression<string>>;
// ... 

Feel free to tell me it's not possible or, alternatively, what the industry-standard approach is.