79592717

Date: 2025-04-25 14:33:33
Score: 5.5
Natty:
Report link

I might have found a fix, but without more context on how this is set up I can't be fully confident that this will help.

I found a GitHub post where they seem to have the same issue, and they got a fix:

GitHub Post

And this video

YouTube Video

Reasons:
  • Blacklisted phrase (1): this video
  • Low length (0.5):
  • No code block (0.5):
  • Me too answer (2.5): have the same issue
  • Low reputation (1):
Posted by: Coder Guy

79592715

Date: 2025-04-25 14:32:32
Score: 0.5
Natty:
Report link

Try using $form->setEntity($entity) instead of setModel()

Reasons:
  • Low length (1.5):
  • Has code block (-0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: DarkSide

79592713

Date: 2025-04-25 14:31:32
Score: 1.5
Natty:
Report link

The schema.prisma was generated with an output path like this:

generator client {
  provider = "prisma-client-js"
  output   = "../lib/generated/prisma"
  
}

Remove the output = "../lib/generated/prisma" line:

generator client {
  provider = "prisma-client-js"
  
}
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Saif Sabry

79592712

Date: 2025-04-25 14:30:32
Score: 0.5
Natty:
Report link

As of May 2023 -metadata rotate has been deprecated.

Use instead:

ffmpeg -display_rotation <rotation_degrees> -i <input_file> -c copy <output_file>

(This of course does not cover all possible options etc.)

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: AMeiri

79592703

Date: 2025-04-25 14:24:30
Score: 2.5
Natty:
Report link

I had two problems:

  1. Duplicate Gson configuration (code + YML); fixing the duplication fixed the Map name.

  2. The keys of the map are used as-is, because Gson formats only fields.
    My solution was to copy the code that formats the field names and use it before inserting into the Map.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Stefan

79592698

Date: 2025-04-25 14:20:29
Score: 2
Natty:
Report link

Leaving this here: if anyone needs to describe Network Rules specifically, you will need 'USAGE' on the schema where the network rule lives and 'OWNERSHIP' of the Network Rule.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Artur Adam

79592695

Date: 2025-04-25 14:17:29
Score: 1
Natty:
Report link

I encountered the same error in Visual Studio 2022, and updating Entity Framework to version 6.5.1 resolved the issue.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: Mohamed Badr

79592689

Date: 2025-04-25 14:14:28
Score: 2.5
Natty:
Report link

Create an XML document which contains details of cars such as id, company name, model, engine, and mileage, and display the same as a table by using XSLT.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Durga

79592688

Date: 2025-04-25 14:13:27
Score: 3
Natty:
Report link

Had the same issue; removing org.slf4j:slf4j-simple from the dependencies solved it.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: nullpointerexception

79592683

Date: 2025-04-25 14:11:27
Score: 4.5
Natty:
Report link

Absolutely agree. I have the same problem with Grails 5.x.

Furthermore, there are no examples available on how to customize scaffolding to get the result needed...
Documentation or sources for the fields taglib are also not available. Really sad.

A really good product killed by too many features...

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Me too answer (2.5): Have the same problem
  • Low reputation (1):
Posted by: frissner

79592681

Date: 2025-04-25 14:10:26
Score: 0.5
Natty:
Report link

You can use AIRegex in AiUtil.FindText / AIUtil.FindTextBlock.

However, a UFT version at least from 2023 is required.

Set regex = AIRegex("some text (.*)")
AIUtil.FindTextBlock(regex).CheckExists True
Reasons:
  • Whitelisted phrase (-1.5): You can use
  • Low length (1):
  • Has code block (-0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: Uwe Görsch

79592680

Date: 2025-04-25 14:10:26
Score: 2.5
Natty:
Report link

You have to use the services to get a real point of view. I've deployed applications in AWS. The administration focus when using R53 is way greater than when using RDS.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Elbashir Saror

79592678

Date: 2025-04-25 14:10:26
Score: 3.5
Natty:
Report link

Just checking in to see if this issue has been resolved. I'm currently encountering the same problem. Thank you!

Reasons:
  • Blacklisted phrase (0.5): Thank you
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: hw.chen

79592665

Date: 2025-04-25 13:59:23
Score: 8.5 🚩
Natty: 5
Report link

I'm asking pretty much the same question. I want one entire task group to finish before the next parallel task group starts (3 parallel task groups at a time). Were you able to find a good solution to this?

Reasons:
  • RegEx Blacklisted phrase (1): Were you able to find a
  • RegEx Blacklisted phrase (3): Were you able
  • Low length (0.5):
  • No code block (0.5):
  • Ends in question mark (2):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Brandon Kiso

79592660

Date: 2025-04-25 13:56:22
Score: 1
Natty:
Report link

After spending time on this, I want to share my findings. Maybe they will be useful.

Enabling w1-gpio on STM32MP1 board

I was able to run the w1-gpio kernel module on the STM32MP135F-DK board by applying this simple patch to the device tree:

######################################################################### 
# Enable w1-gpio kernel module on PF10 GPIO
#########################################################################
diff --git a/stm32mp135f-dk.dts.original b/stm32mp135f-dk.dts
index 0ff8a08..d1ee9ba 100644
--- a/arch/arm/boot/dts/st/stm32mp135f-dk.dts
+++ b/arch/arm/boot/dts/st/stm32mp135f-dk.dts
@@ -152,6 +152,12 @@
        compatible = "mmc-pwrseq-simple";
        reset-gpios = <&mcp23017 11 GPIO_ACTIVE_LOW>;
    };
+   
+   onewire: onewire@0 {
+           compatible = "w1-gpio";
+       gpios = <&gpiof 10 GPIO_OPEN_DRAIN>; // PF10
+       status = "okay";
+       };
 };
 
 &adc_1 {

When using Yocto and the meta-st-stm32 layer, apply the patch by simply adding it to SRC_URI in the linux-stm32mp_%.bbappend file.

Enabling certain kernel modules is also required; I have done that by creating a w1.config file:

CONFIG_W1=m                         # 1-Wire core
CONFIG_W1_MASTER_GPIO=m            # GPIO-based master
CONFIG_W1_SLAVE_THERM=m            # Support for DS18B20
CONFIG_W1_SLAVE_DS28E17=m

In linux-stm32mp_%.bbappend, this w1.config should be added as: KERNEL_CONFIG_FRAGMENTS:append = "${WORKDIR}/w1.config"

This should be enough to run w1-gpio and read the temperature from a DS18B20 sensor.

Later on I was able to modify the w1-gpio module to support my custom slaves. I add those slaves manually (via sysfs), all under a non-standard family code. When the w1 core has a slave with a family code that is not supported by any dedicated library, one can use the sysfs file called rw to read/write to that slave. It works with my slaves, although there are a lot of problems with stability. I use a C program to read/write to that rw file, but nearly half of the read operations fail, because the master loses timing for some microseconds. I think it's due to some CPU interrupts coming in. I am thinking about using the kernel connector instead of the rw sysfs file, as described here

Reasons:
  • RegEx Blacklisted phrase (1): I want
  • Long answer (-1):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Mateusz Michala

79592656

Date: 2025-04-25 13:54:21
Score: 0.5
Natty:
Report link

@mattrick's example

I followed @mattrick's example of using an IntersectionObserver, giving a bound on the rootMargin, and attached it to the physical header. I am just answering for the sake of adding additional information to @mattrick's answer, since @mattrick didn't provide an example.

IntersectionObserver

IntersectionObserver emits an IntersectionObserverEntry when triggered, which has an isIntersecting property that indicates whether or not the actual header is intersecting the viewport or the element.

In this case:

Implementation

Note that my implementation uses Tailwind and TypeScript, but it can be created in plain CSS and JS.

Basic HTML

<!doctype html>
<html>
  <head></head>

  <body class="flex flex-col min-h-screen">
    <header id="header" class="banner flex flex-row mb-4 p-4 sticky top-0  z-50 w-full bg-white"></header>

      <main id="main" class="main flex-grow"></main>
    <footer class="content-info p-4 bg-linear-footer bottom-0 mt-4"></footer>
  </body>
</html>

Note: The <header> requires an id of header for the JS to reference the element.

Typescript (Js) Implementation

export class Header {
    static checkSticky() {
        const header = document.getElementById("header");

        if (header == null) {
            return; // Abort
        }

        const observer = new IntersectionObserver(
            ([entry]) => this._handleStickyChange(entry , header) ,
            {
              rootMargin: '-1px 0px 0px 0px', 
              threshold: [1], 
            }
        );

        observer.observe(header);
    }


    static _handleStickyChange(entry : IntersectionObserverEntry , header : HTMLElement ) {
        if (!entry.isIntersecting) {
            header.classList.add("your-class");
            return; // Abort further execution
        }

        header.classList.remove("your-class");
    }
}

Call Header.checkSticky() when the DOM is ready to start observing the header. The observer will trigger _handleStickyChange() reactively based on whether the header is intersecting the viewport.

This allows you to add visual effects (e.g., shadows, background changes) or trigger callbacks when the header becomes sticky.

Thanks @mattrick for your initial contribution.

Reasons:
  • Blacklisted phrase (0.5): Thanks
  • Long answer (-1):
  • Has code block (-0.5):
  • User mentioned (1): @mattrick
  • User mentioned (0): @mattrick
  • User mentioned (0): @mattrick's
  • User mentioned (0): @mattrick
  • User mentioned (0): @mattrick
  • Low reputation (0.5):
Posted by: Cat100

79592654

Date: 2025-04-25 13:53:21
Score: 1.5
Natty:
Report link
oldlist = ["Peter", "Paul", "Mary"]

newlist = list(map(str.upper, oldlist))

print(newlist)

['PETER', 'PAUL', 'MARY']
Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: anody

79592646

Date: 2025-04-25 13:47:19
Score: 3
Natty:
Report link

The solution is described in this thread:

https://github.com/expressive-code/expressive-code/issues/330

Reasons:
  • Whitelisted phrase (-1): solution is
  • Probably link only (1):
  • Low length (1.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: marko kraljevic

79592639

Date: 2025-04-25 13:43:19
Score: 2
Natty:
Report link

Duplicate of Intel HAXM is required to run this AVD - Your CPU does not support VT-x

This issue has already been addressed in the post linked above. The error typically occurs when:

  1. Your CPU does not support Intel VT-x / AMD-V, or

  2. VT-x is disabled in the BIOS/UEFI settings.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Bigtree Bigtree

79592635

Date: 2025-04-25 13:39:18
Score: 1.5
Natty:
Report link

Let me give some insights into each one of your questions:

Reasons:
  • Blacklisted phrase (1): this document
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: shiro

79592626

Date: 2025-04-25 13:32:16
Score: 1.5
Natty:
Report link

Example tested with UID

The thing is, you should export the UID variable and then it works:

export UID=${UID}

Put user: "${UID}" in your docker-compose file

docker compose up

...

profit

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Вячеслав

79592624

Date: 2025-04-25 13:31:16
Score: 0.5
Natty:
Report link

The renaming is always applied only to the topics. The consumer group names remain the same regardless of the replication policy. When syncing the offsets, the topics are renamed according to the policy as well, but the group is not.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: Jakub

79592621

Date: 2025-04-25 13:31:16
Score: 1.5
Natty:
Report link

Based on what you've shared, I have 2 theories about what might be wrong.

  1. (Most likely) Since you didn't provide the full command output from inside the container (i.e. curl vs curl ... | grep ...), I can assume that the grep version inside the container works differently than expected. This usually happens with more complex commands (e.g. when using -E), but it is worth checking the full piped pair.

  2. (Less likely) Weird idea, but maybe the YAML itself is not resolved correctly? Try to make it as simple as possible to double-check:

    startupProbe:
      exec:
        command: ["sh", "-c", "curl -s -f http://localhost:8080/v1/health | grep -q -e '\"status\":\"healthy\"'"]
    

If this doesn't work, try to make it verbose and check the Pod logs:

startupProbe:
  exec:
    command:
      - sh
      - -c
      - >
          echo "PROBE DEBUG";
          curl -v http://localhost:8080/v1/health;
          curl http://localhost:8080/v1/health |
          grep -e '"status":"healthy"';
          echo "$?"
Reasons:
  • RegEx Blacklisted phrase (1.5): resolved correctly?
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (0.5):
Posted by: mikalai

79592620

Date: 2025-04-25 13:30:16
Score: 3.5
Natty:
Report link

The answer can possibly be found here. Although this is what solved the issue in my situation:
The strange case of Data source can’t be created with Reporting Services 2016 in Azure VM | Microsoft Learn

Reasons:
  • Probably link only (1):
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Van Heghe Eddy

79592618

Date: 2025-04-25 13:28:14
Score: 7.5 🚩
Natty: 5.5
Report link

Have you found the answer to this problem?

Reasons:
  • Blacklisted phrase (2): have you find
  • Low length (1.5):
  • No code block (0.5):
  • Ends in question mark (2):
  • Single line (0.5):
  • Low reputation (1):
Posted by: mostafa guellicha

79592616

Date: 2025-04-25 13:26:13
Score: 2
Natty:
Report link

Based on the suggestions made above, the following worked as required:

program | jq -r '[.mmsi, .rxtime, .speed, .lon, .lat] |@csv'

This also delivered practically the same result:

program | jq -r '[.mmsi, .rxtime, .speed, .lon, .lat] | join(",")'

Thanks for the many contributions.

Reasons:
  • Blacklisted phrase (0.5): Thanks
  • Low length (0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: xuraax

79592615

Date: 2025-04-25 13:26:13
Score: 4
Natty:
Report link

https://learn.microsoft.com/en-us/answers/questions/2259532/azure-function-(event-grid-duckdb)-not-publishing?page=1&orderby=helpful&translated=false#message-latest

Answered here.

Now, I am able to resolve the issue by creating another Python function and using Pandas to convert Parquet to JSON data.
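
For illustration only (not from the original answer), here is a minimal sketch of such a Pandas-based Parquet-to-JSON conversion; the file names are hypothetical:

import pandas as pd

# Minimal sketch (hypothetical file names): read the Parquet payload and
# re-serialize it as JSON records, as the Pandas-based workaround describes.
df = pd.read_parquet("input.parquet")        # needs pyarrow or fastparquet installed
json_payload = df.to_json(orient="records")  # one JSON object per row

with open("output.json", "w") as fh:
    fh.write(json_payload)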

Reasons:
  • Probably link only (1):
  • Low length (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: user14099839

79592606

Date: 2025-04-25 13:23:12
Score: 0.5
Natty:
Report link

This works for replayed builds (not rebuilt builds). The output is the build number of the original build from which your current build was replayed:

def getReplayCauseNumber() {
    // This function is used to access the build number of the build from which a build was replayed from

    def cause = currentBuild.rawBuild.getCause(org.jenkinsci.plugins.workflow.cps.replay.ReplayCause)

    if (cause == null){
        return null
    }
    
    def originalNum = cause.getOriginalNumber()
    echo "This build was replayed from build #${originalNum}"
    return originalNum
    
}
Reasons:
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: Xilef Blaytra

79592605

Date: 2025-04-25 13:23:12
Score: 2
Natty:
Report link

This worked for me, nice solution :)

Reasons:
  • Whitelisted phrase (-1): This worked for me
  • Whitelisted phrase (-1): worked for me
  • Low length (2):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Ramam Rajdev

79592594

Date: 2025-04-25 13:18:10
Score: 10 🚩
Natty:
Report link

I'm facing the same issue. Any luck with this?

Reasons:
  • Blacklisted phrase (1.5): Any luck
  • Low length (1.5):
  • No code block (0.5):
  • Me too answer (2.5): facing the same issue
  • Ends in question mark (2):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Sanath

79592592

Date: 2025-04-25 13:18:10
Score: 4.5
Natty: 6
Report link

If it's only for monitoring on a single monitor, why not use Grafana? Much simpler and with no headaches.

Reasons:
  • Blacklisted phrase (1): não
  • Low length (1):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Sammir

79592583

Date: 2025-04-25 13:07:07
Score: 1.5
Natty:
Report link

PS C:\Users\Maria\OneDrive\Documents\React Demo> npm start
npm ERR! Missing script: "start"
npm ERR!
npm ERR! Did you mean one of these?
npm ERR!   npm star   # Mark your favorite packages
npm ERR!   npm stars  # View packages marked as favorites
npm ERR!
npm ERR! To see a list of scripts, run:
npm ERR!   npm run

npm ERR! A complete log of this run can be found in:
npm ERR!   C:\Users\Maria\AppData\Local\npm-cache\_logs\2025-04-25T13_01_09_556Z-debug-0.log
PS C:\Users\Maria\OneDrive\Documents\React Demo>

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: maria khan

79592582

Date: 2025-04-25 13:07:07
Score: 2
Natty:
Report link

Yes, it's possible, but stating that by itself is probably not very helpful.

For a practical demonstration of how to do it, look at the code here: https://github.com/BartMassey/rolling-crc

...which is based on a forum discussion, archived here: https://web.archive.org/web/20161001160801/http://encode.ru/threads/1698-Fast-CRC-table-construction-and-rolling-CRC-hash-calculation

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • No code block (0.5):
Posted by: Maks Verver

79592566

Date: 2025-04-25 13:02:05
Score: 3
Natty:
Report link

If you are using VS Code, you can right-click on the file -> Apply Changes. This will apply the changes of the file to your current working branch.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Gowtham Selvaraju

79592560

Date: 2025-04-25 12:58:04
Score: 1
Natty:
Report link

I'm trying to hide both the status bar and navigation bar using:

WindowInsetsControllerCompat(window, window.decorView)
    .hide(WindowInsetsCompat.Type.systemBars())

This works correctly only when my theme is:

<style name="AppTheme" parent="Theme.MaterialComponents.Light.DarkActionBar" />

But when I switch to a Material3 theme like:

<style name="Base.Theme.DaakiaTest" parent="Theme.Material3.Light.NoActionBar" />

...the navigation bar hides, but the status bar just becomes transparent with dark text, rather than fully hiding.

I'm already using:

WindowCompat.setDecorFitsSystemWindows(window, false)

I can’t switch back to the old MaterialComponents theme because my app uses Material3 components heavily, and switching would require large UI refactoring.

So my question is: Why does WindowInsetsControllerCompat.hide(WindowInsetsCompat.Type.statusBars()) not fully hide the status bar when using a Material3 theme?

Is there a workaround that allows full immersive mode with Theme.Material3.Light.NoActionBar?

Any guidance would be much appreciated!

Reasons:
  • Blacklisted phrase (1): appreciated
  • Long answer (-1):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: Ashif Ali

79592548

Date: 2025-04-25 12:53:02
Score: 1.5
Natty:
Report link

Instead of using a shared singleton, it's cleaner in Clean Architecture to pass the log object explicitly through the layers.

This way, the log stays tied to the request and avoids shared/global state, which fits Clean Architecture better.
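
As an illustration only (not from the original answer), here is a minimal Python sketch of the idea, with hypothetical layer functions, showing a request-scoped log object passed explicitly instead of a shared singleton:

from dataclasses import dataclass, field

@dataclass
class RequestLog:
    # Hypothetical request-scoped log object; it lives only for one request.
    entries: list = field(default_factory=list)

    def add(self, message: str) -> None:
        self.entries.append(message)

def register_user(name: str, log: RequestLog) -> None:
    # Use-case layer: receives the log explicitly...
    log.add(f"registering {name}")
    save_user(name, log)

def save_user(name: str, log: RequestLog) -> None:
    # ...and passes it down to the gateway/repository layer.
    log.add(f"saved {name}")

log = RequestLog()
register_user("alice", log)
print(log.entries)  # ['registering alice', 'saved alice']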

Reasons:
  • No code block (0.5):
  • Low reputation (1):
Posted by: PSAU

79592543

Date: 2025-04-25 12:51:01
Score: 0.5
Natty:
Report link

To access services running on your host computer in the emulator, run adb -e reverse tcp:8080 tcp:8080. This will allow you to access it on 127.0.0.1:8080 in the emulator.

Adjust the protocol (here, TCP) and port (here, 8080) to your needs.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: FliegendeWurst

79592541

Date: 2025-04-25 12:49:58
Score: 6 🚩
Natty:
Report link

Have you added a button with submit type?

<MudButton ButtonType="ButtonType.Submit" Variant="Variant.Filled" Color="Color.Primary" Class="ml-auto">Register</MudButton>

I suggest you see this link if you haven't.

Reasons:
  • Blacklisted phrase (1): this link
  • RegEx Blacklisted phrase (1): see this link
  • Probably link only (1):
  • Low length (1):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Low reputation (0.5):
Posted by: Arash Yazdani

79592531

Date: 2025-04-25 12:41:56
Score: 5
Natty:
Report link

GCP TSE is here to help you with your situation 🤞.

  1. How can I restore the <number>[email protected] account?

You're right - as per Google Cloud Docs [1] you can't restore your Service Account (SA), because after 30 days, IAM permanently removes it.

  2. How can I configure the Firebase CLI to use a newly created or existing service account for Cloud Functions deployment instead of the deleted default?

Firebase CLI has several ways [2] to authenticate to the API: using Application Default Credentials (ADC) or using FIREBASE_TOKEN (considered legacy). You might have some kind of custom setup, but in general, to authenticate the Firebase CLI with an SA you should follow this simple guide [3]:

  1. Locate an existing SA or create a new one to be used by Firebase CLI;
  2. Grant all required roles to this SA (explained in [3], but you might need more narrow roles for your specific case);
  3. Setup ADC depending on your environment;
  4. Update the GOOGLE_APPLICATION_CREDENTIALS OS environment variable using gcloud auth application-default login or manually (depending on your dev environment). Details are in the linked docs.

[1] https://cloud.google.com/iam/docs/service-accounts-delete-undelete#undeleting
[2] https://firebase.google.com/docs/cli#cli-ci-systems
[3] https://firebase.google.com/docs/app-distribution/authenticate-service-account
[4] https://cloud.google.com/docs/authentication/provide-credentials-adc


If you haven't solved your problem using the above guide, please explain your deployment process step-by-step. Also, try to answer as much as possible:

Reasons:
  • Blacklisted phrase (0.5): How can I
  • RegEx Blacklisted phrase (2.5): please explain your
  • Long answer (-1):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Looks like a comment (1):
  • Low reputation (0.5):
Posted by: mikalai

79592528

Date: 2025-04-25 12:37:55
Score: 3
Natty:
Report link

I created this; I do not know whether it can solve your problem:

TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 2 HOUR) --- now()

t.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: tünde

79592521

Date: 2025-04-25 12:35:54
Score: 2
Natty:
Report link

Seems like the issue was on the company antivirus side, and it was only affecting Firefox. Activating the "allow all data uploads" option in the antivirus data loss prevention settings resolved the issue.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Single line (0.5):
Posted by: bokkie

79592516

Date: 2025-04-25 12:33:53
Score: 2.5
Natty:
Report link

Great news, I think we have this figured out.

After a pipeline run in Azure navigate to Test Plans -> Runs

enter image description here

Then select the run you're looking for

enter image description here

Double-click on the run and you get the Run Summary page; now double-click the attachment

enter image description here

This can be opened in Visual Studio etc

enter image description here

And double clicking each test will show the steps etc in all their glory

enter image description here

Nice..

Reasons:
  • Probably link only (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: Kev

79592512

Date: 2025-04-25 12:29:52
Score: 0.5
Natty:
Report link

Instead of explicitly making each verse of lyrics in parallel (with the <<>> structure inside the \new Staff block), follow the \new Staff block with consecutive \addlyrics blocks.


Inter-syllable hyphenation should be written with two dashes: --. These will be visible when the horizontal spacing allows, but disappear when the syllables are close together.

A single underscore _ can be used to skip a note for a melisma. Extender lines are typed with two underscores __.

\version "2.24.1"

\new Staff {
    \key e \major
    \time 3/4
    \relative c'' {
        e4 e8 cis \tuplet 3/2 { dis dis dis } |
        e8 e e2 |
        a8 a a a \tuplet 3/2 { gis gis a } |
    }   
}
\addlyrics { 
    Ci -- bo~e be -- van -- da di 
    vi -- _ ta, 
    bal -- sa -- mo, __ _ ves -- te, di
}
\addlyrics {
    Cris -- to __ _ Ver -- bo del 
    Pa -- _ dre, 
    re __ _ _ glo -- rio -- so fra
}

Image generated from above code, similar to the image in the question, but with some noticeable differences:  1. The melismas in the second verse are rendering.  2. Inter-syllable hyphens are smaller or unseen.  3. There is a key signature (E major) and time signature is (3/4).

Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Elements In Space

79592510

Date: 2025-04-25 12:28:52
Score: 1.5
Natty:
Report link

I ended up creating a regular script and just using gradlew like I would in the terminal on my local machine, which worked as intended.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: Rashad.Z

79592508

Date: 2025-04-25 12:27:52
Score: 3.5
Natty:
Report link

Yes; rather than using HTTP, use WebSockets for chat, so that changes are pushed as they happen.

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Mohamed ayman Elshenawy

79592501

Date: 2025-04-25 12:21:50
Score: 1.5
Natty:
Report link

The best thing to do would be to set up a dedicated /edit endpoint which accepts a unique identifier and only the fields you wish to edit. That way, if you POST to this endpoint with just a new description for example, you won't need to include all of the images in the POST request. You would simply update the Mongo document with the new description, rather than rewriting the entire thing.
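
For illustration only (not part of the original answer), here is a minimal sketch of such an /edit endpoint, assuming Flask and PyMongo; the database, collection, and field names are hypothetical:

from flask import Flask, request, jsonify
from pymongo import MongoClient
from bson import ObjectId

app = Flask(__name__)
collection = MongoClient()["mydb"]["listings"]  # hypothetical db/collection names

@app.post("/edit")
def edit():
    payload = request.get_json()
    doc_id = payload.pop("id")  # unique identifier of the document to edit
    # $set updates only the fields that were sent (e.g. just a new description),
    # leaving the images stored on the same document untouched.
    collection.update_one({"_id": ObjectId(doc_id)}, {"$set": payload})
    return jsonify({"updated": list(payload.keys())})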

Reasons:
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Danny V

79592499

Date: 2025-04-25 12:20:50
Score: 2.5
Natty:
Report link

How about using slices.Sort?

func (m Map) String() string {
    vs := []string{}
    for k, v := range m {
        vs = append(vs, fmt.Sprintf("%s:%s", k.String(), v.String()))
    }
    slices.Sort(vs)
    return fmt.Sprintf("{%s}", strings.Join(vs, ","))
}

Note for your Map that “If the key type is an interface type, [the comparison operators == and !=] must be defined for the dynamic key values; failure will cause a run-time panic.”

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Starts with a question (0.5): How
  • High reputation (-1):
Posted by: eik

79592498

Date: 2025-04-25 12:19:49
Score: 1
Natty:
Report link

CardDAV is used to distribute contacts and synchronize them between different devices using a central vCard repository. If you want to access full address books offline and have access to them from different devices CardDAV is the way to go.

LDAP is like a database which you can search for contact information. LDAP can be useful when you rely mostly on contact search rather than keeping a local copy; this is particularly useful when there is a large collection of contacts but you need only a few at a time. LDAP is also useful when you do not want to expose all contacts in the address book to the user, which is especially true in an enterprise. Direct LDAP access is generally not allowed in an organization, or is allowed only within the WAN or via VPN.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Shubhra Prakash Nandi

79592494

Date: 2025-04-25 12:16:48
Score: 0.5
Natty:
Report link

Why is TypeScript not restricting the type alias to adhere to its structure?

In the following TypeScript code:

type User = [number, string];
const newUser: User = [112, "[email protected]"];

newUser[1] = "hc.com";     // ✅ Allowed
newUser.push(true);        // ⚠️ No error?!

I expected TypeScript to prevent newUser.push(true) since User is defined as a tuple of [number, string]. However, TypeScript allows this due to the mutable nature of tuples.

What's going on?

Tuples in TypeScript are essentially special arrays. At runtime, there's no real distinction between an array and a tuple — both are JavaScript arrays. Unless specified otherwise, tuples are mutable, and methods like .push() are available.

So newUser.push(true) compiles because:

TypeScript treats the tuple as an array.

.push() exists on arrays.

TypeScript doesn't strictly enforce the tuple's length or element types for mutations unless stricter typing is applied.

How to enforce stricter tuple rules

  1. Make the tuple readonly: This prevents any modification, including .push() and element reassignment.
type User = readonly [number, string];
  2. Use as const for literals: This creates an immutable tuple from the start.
const newUser = [112, "[email protected]"] as const;

This will infer the type as readonly [112, "[email protected]"] and block any mutation attempts.

Reasons:
  • Long answer (-1):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Self-answer (0.5):
  • Starts with a question (0.5): Why is
  • Low reputation (0.5):
Posted by: Radical Rosh

79592493

Date: 2025-04-25 12:14:48
Score: 2
Natty:
Report link

You have set `ssh_agent_auth` to true; have you started the ssh agent on the machine where you are running your packer build?

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
Posted by: Anshul Sharma

79592490

Date: 2025-04-25 12:12:47
Score: 0.5
Natty:
Report link

I had this error because I incorrectly followed the install instructions and put lazy.lua into ~/.config/nvim/config/ instead of ~/.config/nvim/lua/config. Your ~/.config/nvim directory tree should look like this:

.
├── init.lua
└── lua
    ├── config
    │   └── lazy.lua
    └── plugins.lua
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: statusfailed

79592487

Date: 2025-04-25 12:12:47
Score: 1.5
Natty:
Report link

Try to use a Packer data source; it can download libs/tools for you and keep them ready for your source block. It can be used to pre-populate values from the web for use in Packer image building.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
Posted by: Anshul Sharma

79592478

Date: 2025-04-25 12:06:46
Score: 1
Natty:
Report link

I wonder if you have looked into Azure Content Safety; it has a few ways you could configure the level of content safety. The content safety feature cannot be turned off/disabled by yourself directly.

This content filtering system is powered by Azure AI Content Safety, and it works by running both the prompt input and completion output through an ensemble of classification models aimed at detecting and preventing the output of harmful content. https://learn.microsoft.com/en-us/azure/ai-foundry/concepts/content-filtering

enter image description here

enter image description here

If you really find that Content Safety is causing unexpected results for your use case and you are a managed Azure customer, you can request deactivation of the content filtering in your subscription via the following online form: https://ncv.microsoft.com/uEfCgnITdR (Azure OpenAI Limited Access Review: Modified Content Filtering)

https://learn.microsoft.com/en-us/answers/questions/2110040/how-can-i-disable-the-content-filter-in-azure-open

Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • No code block (0.5):
Posted by: qkfang

79592477

Date: 2025-04-25 12:06:46
Score: 1.5
Natty:
Report link

screenshot on Macbook

  1. Open Android Studio > Settings (⌘ ,)

  2. Go to Tools > Device Mirroring

  3. Tick both:

    • Activate mirroring when a new physical device is connected

    • Activate mirroring when launching an app on a physical device

  4. Click Apply and OK

  5. Connect your Android phone via USB (enable USB debugging)

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Oluwaseun Oyewale

79592467

Date: 2025-04-25 11:58:43
Score: 2.5
Natty:
Report link
Hi. In the end, you couldn't find a solution? We faced the same problem.
Reasons:
  • Low length (1.5):
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Stillwaters

79592466

Date: 2025-04-25 11:58:43
Score: 1.5
Natty:
Report link

WebStorm v2025.2

You can find the changes using the Command + 0 shortcut or by clicking the icon in the side menu.

changes screenshot

If you prefer to have the Changes tab at the bottom (as it was before), go to:
Settings → Advanced Settings → Version Control
and disable "Open Diff as Editor Tab."
settings screenshot

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Alex

79592462

Date: 2025-04-25 11:57:43
Score: 0.5
Natty:
Report link

I think it's impossible. The MediaCodec resources are shared among all applications in the system, so the system cannot guarantee that your upcoming MediaCodec creation will succeed even if it appears that resources are currently available — another application may create a MediaCodec in the meantime. Moreover, the creation of a MediaCodec mainly depends on the vendor's implementation. Therefore, aside from actually attempting to create a MediaCodec to see if it succeeds, there's no way to determine in advance whether the creation will be successful.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: wangpan

79592461

Date: 2025-04-25 11:56:43
Score: 3.5
Natty:
Report link

The problem seems to be in the parameter passed to the stored procedure.

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Henil Mamaniya

79592456

Date: 2025-04-25 11:54:42
Score: 1
Natty:
Report link

The standard C files are already compiled and are part of the stdc++ library and other libraries linked to it.
In my case, it was there in
/usr/lib/x86_64-linux-gnu/libstdc++.so.6 /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30.

A sample test to check whether a .so contains a function or not.
I just checked whether printf is present in this libstdc++.so.6.
readelf -a libstdc++.so.6 | grep printf

000000226468 001f00000007 R_X86_64_JUMP_SLO 0000000000000000 __fprintf_chk@GLIBC_2.3.4 + 0

000000226ec0 005b00000007 R_X86_64_JUMP_SLO 0000000000000000 sprintf@GLIBC_2.2.5 + 0

000000227448 007900000007 R_X86_64_JUMP_SLO 0000000000000000 vsnprintf@GLIBC_2.2.5 + 0

000000227bb8 009f00000007 R_X86_64_JUMP_SLO 0000000000000000 __sprintf_chk@GLIBC_2.3.4 + 0

Each gcc version has a corresponding version of libstdc++.so, which is why you cannot run an executable built with a higher version of gcc on a system with a lower version: it misses the runtime symbols required for it.

Hope it answers your question.

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Filler text (0.5): 0000000000000000
  • Filler text (0): 0000000000000000
  • Filler text (0): 0000000000000000
  • Filler text (0): 0000000000000000
  • Low reputation (1):
Posted by: sairaman g

79592448

Date: 2025-04-25 11:47:40
Score: 3.5
Natty:
Report link

select ((select count(*) b4 from tblA)-(select count(*) after from tblB) );

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: user23599136

79592445

Date: 2025-04-25 11:46:39
Score: 2
Natty:
Report link

If you are using Flutter like me and you just want to create a new release without running the project, then just run flutter clean, after this run flutter pub get to install the dependencies, and then install the pods using cd ios && pod install && cd .., and you should be good to go.

If it's still not working, try restarting Xcode and cleaning the Xcode build folder using Cmd+Shift+K, and you should be good to go.

Reasons:
  • Blacklisted phrase (2): still not working
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Ali Hussnain - alichampion

79592440

Date: 2025-04-25 11:43:39
Score: 1
Natty:
Report link

In my case I was installing SQL Server 2022 Developer and I received the same error about a missing msoledbsql.msi. I found this file in the setup package (in my case in "C:\SQL2022\Developer_ENU\1033_ENU_LP\x64\Setup\x64\msoledbsql.msi"). I tried to run it manually and received an error message that a higher version was already installed, so I downloaded a newer version than the one installed in the system and substituted the file in the setup package with the downloaded file. Then I reran the installation and it succeeded.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: sebetovsky

79592437

Date: 2025-04-25 11:41:37
Score: 10 🚩
Natty: 4.5
Report link

Did you find the answer? I'm also facing the same issue.

Reasons:
  • RegEx Blacklisted phrase (3): did you find the answer
  • Low length (1.5):
  • No code block (0.5):
  • Me too answer (2.5): I'm also facing the same issue
  • Contains question mark (0.5):
  • Single line (0.5):
  • Starts with a question (0.5): did you find the answer
  • Low reputation (1):
Posted by: Kishore Murali

79592434

Date: 2025-04-25 11:41:37
Score: 3
Natty:
Report link

Not in this case, but just in case you forgot to open PowerShell as Administrator, this same error can happen.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: thedarksage

79592429

Date: 2025-04-25 11:38:36
Score: 4
Natty:
Report link

Look at the body.

I think I helped you.

Reasons:
  • Low length (2):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: LikWer

79592428

Date: 2025-04-25 11:38:36
Score: 0.5
Natty:
Report link

I have phpMyAdmin v5.2.2 (25-Apr-2025).
Adding the following config snippet in C:\wamp64\apps\phpmyadmin5.2.2\config.inc.php worked!

If anyone could shed some light in the comments: I'm not sure how a particular column is picked for display there. Even though I have multiple string columns and tried reordering them as well, it chose title over content.

// ...

$cfg['Servers'][$i]['pmadb']         = 'phpmyadmin';      // Your pmadb name
$cfg['Servers'][$i]['relation']      = 'pma__relation';   // Relation table
$cfg['Servers'][$i]['table_info']    = 'pma__table_info'; // Display-column
$cfg['Servers'][$i]['column_info']   = 'pma__column_info';// Column transformation info

/* End of servers configuration */
?>

Edit Mode

enter image description here

Hover state

Snip of phpmyadmin foreign key hover text

Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Hardik

79592427

Date: 2025-04-25 11:37:35
Score: 0.5
Natty:
Report link

I finally managed to get it.

At the beginning of my test file:

import sys


class FakeGlobalConfig:
    def __init__(self):
        self.ProjectName = ""


class FakeSettings:
    def __init__(self):
        self.global_config = FakeGlobalConfig()


import project.Settings

sys.modules["project.Settings"].Settings = FakeSettings

That's been placed at the very beginning, before anything else.

With that, we override the real `Settings` class and set the attributes we need.

Reasons:
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: Trauma

79592418

Date: 2025-04-25 11:33:34
Score: 1.5
Natty:
Report link

Please reinstall the WooCommerce plugin.

You can do this manually by going to the "Plugins" section in your WordPress dashboard, clicking on "Add New," and then uploading the WooCommerce plugin ZIP file.

Some of your plugin files may be missing, possibly due to an incomplete update, or incorrect file permissions may be set for the WooCommerce files.

Make sure the referenced file has proper permissions (read and write for the owner, read-only for others), which is usually set to 644.

Reasons:
  • No code block (0.5):
  • Low reputation (1):
Posted by: Ali

79592417

Date: 2025-04-25 11:30:34
Score: 0.5
Natty:
Report link

Use the following command:

docker run -p 60000:60000 -v C:\Utils\Opserver-main\Config:/app/Config --rm -it (docker build -q .)

Your command only works in WSL or Git Bash, but not in Command Prompt or PowerShell.

Explanation: Use PowerShell () syntax instead of $()

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Omar

79592413

Date: 2025-04-25 11:27:33
Score: 1.5
Natty:
Report link

You can solve the problem by adding from bokeh.models import LogScale to the imports and adding a line to your code: p3.extra_y_scales = {"log": LogScale()}

You can also have a glance at this post:

Twin Axis with linear and logarithmic scale using bokeh plot Python

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Swawa

79592411

Date: 2025-04-25 11:26:33
Score: 1
Natty:
Report link

None of the above worked for me; I just did:

rm -rf ios/Pods

rm -rf ios/Podfile.lock

flutter clean

flutter pub get

cd ios

pod install --repo-update

cd ..

flutter run

Reasons:
  • Whitelisted phrase (-1): worked for me
  • Low length (1):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Onalo Joseph

79592406

Date: 2025-04-25 11:24:32
Score: 4.5
Natty:
Report link

We were finally able to resolve this by implementing AccessTokenCallback for the sql connection: https://learn.microsoft.com/en-us/dotnet/api/microsoft.data.sqlclient.sqlconnection.accesstokencallback?view=sqlclient-dotnet-standard-5.2#microsoft-data-sqlclient-sqlconnection-accesstokencallback

Reasons:
  • Probably link only (1):
  • Low length (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Nikita Christie

79592389

Date: 2025-04-25 11:17:29
Score: 1
Natty:
Report link

This is what happens when you include #include <stdio.h> (adds input/output functions like printf), #include <stdlib.h> for memory management, #include <string.h> for string manipulation, etc.: you tell the compiler to copy the declarations for printf, scanf, etc. Those declarations are in the header files such as stdio.h, stdlib.h, etc.; the code for these functions is already compiled and is part of the GNU C library.

You can use verbose mode to see where these files are located:

gcc -v your_program.c -o your_program

The output would look like this:

$ gcc -v test.c -o test.o 
Using built-in specs. 
COLLECT_GCC=gcc 
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-linux-gnu/13/lto-wrapper 
OFFLOAD_TARGET_NAMES=nvptx-none:amdgcn-amdhsa 
OFFLOAD_TARGET_DEFAULT=1 
Target: x86_64-linux-gnu 
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 13.3.0-6ubuntu2~24.04' --with-bugurl=file:///usr/share/doc/gcc-13/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,m2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-13 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/libexec --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-libstdcxx-backtrace --enable-gnu-unique-object --disable-vtable-verify --enable-plugin --enable-default-pie --with-system-zlib --enable-libphobos-checking=release --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --disable-werror --enable-cet --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none=/build/gcc-13-fG75Ri/gcc-13-13.3.0/debian/tmp-nvptx/usr,amdgcn-amdhsa=/build/gcc-13-fG75Ri/gcc-13-13.3.0/debian/tmp-gcn/usr --enable-offload-defaulted --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu --with-build-config=bootstrap-lto-lean --enable-link-serialization=2 
Thread model: posix 
Supported LTO compression algorithms: zlib zstd 
gcc version 13.3.0 (Ubuntu 13.3.0-6ubuntu2~24.04)  
COLLECT_GCC_OPTIONS='-v' '-o' 'test.o' '-mtune=generic' '-march=x86-64' '-dumpdir' 'test.o-' 
/usr/libexec/gcc/x86_64-linux-gnu/13/cc1 -quiet -v -imultiarch x86_64-linux-gnu test.c -quiet -dumpdir test.o- -dumpbase test.c -dumpbase-ext .c -mtune=generic -march=x86-64 -version -fasynchronous-unwind-tables -fstack-protector-strong -Wformat -Wformat-security -fstack-clash-protection -fcf-protection -o /tmp/ccRbksbn.s
GNU C17 (Ubuntu 13.3.0-6ubuntu2~24.04) version 13.3.0 (x86_64-linux-gnu) 
    compiled by GNU C version 13.3.0, GMP version 6.3.0, MPFR version 4.2.1, MPC version 1.3.1, isl version isl-0.26-GMP 
 
GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072 
ignoring nonexistent directory "/usr/local/include/x86_64-linux-gnu" 
ignoring nonexistent directory "/usr/lib/gcc/x86_64-linux-gnu/13/include-fixed/x86_64-linux-gnu" 
ignoring nonexistent directory "/usr/lib/gcc/x86_64-linux-gnu/13/include-fixed" 
ignoring nonexistent directory "/usr/lib/gcc/x86_64-linux-gnu/13/../../../../x86_64-linux-gnu/include" 
#include "..." search starts here: 
#include <...> search starts here: 
/usr/lib/gcc/x86_64-linux-gnu/13/include
/usr/local/include
/usr/include/x86_64-linux-gnu
/usr/include
End of search list. 
Compiler executable checksum: 38987c28e967c64056a6454abdef726e 
COLLECT_GCC_OPTIONS='-v' '-o' 'test.o' '-mtune=generic' '-march=x86-64' '-dumpdir' 'test.o-' 
as -v --64 -o /tmp/ccwriUWR.o /tmp/ccRbksbn.s
GNU assembler version 2.42 (x86_64-linux-gnu) using BFD version (GNU Binutils for Ubuntu) 2.42 
COMPILER_PATH=/usr/libexec/gcc/x86_64-linux-gnu/13/:/usr/libexec/gcc/x86_64-linux-gnu/13/:/usr/libexec/gcc/x86_64-linux-gnu/:/usr/lib/gcc/x86_64-linux-gnu/13/:/usr/lib/gcc/x86_64-linux-gnu/ 
LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/13/:/usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu/:/usr/lib/gcc/x86_64-linux-gnu/13/../../../../lib/:/lib/x86_64-linux-gnu/:/lib/../lib/:/usr/lib/x86_64-linux-gnu/:/usr/lib/../lib/:/usr/lib/gcc/x86_64-linux-gnu/13/../../../:/lib/:/usr/lib/ 
COLLECT_GCC_OPTIONS='-v' '-o' 'test.o' '-mtune=generic' '-march=x86-64' '-dumpdir' 'test.o.' 
/usr/libexec/gcc/x86_64-linux-gnu/13/collect2 -plugin /usr/libexec/gcc/x86_64-linux-gnu/13/liblto_plugin.so -plugin-opt=/usr/libexec/gcc/x86_64-linux-gnu/13/lto-wrapper -plugin-opt=-fresolution=/tmp/ccO6Fe7I.res -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s --build-id --eh-frame-hdr -m elf_x86_64 --hash-style=gnu --as-needed -dynamic-linker /lib64/ld-linux-x86-64.so.2 -pie -z now -z relro -o test.o /usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu/Scrt1.o /usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu/crti.o /usr/lib/gcc/x86_64-linux-gnu/13/crtbeginS.o -L/usr/lib/gcc/x86_64-linux-gnu/13 -L/usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu -L/usr/lib/gcc/x86_64-linux-gnu/13/../../../../lib -L/lib/x86_64-linux-gnu -L/lib/../lib -L/usr/lib/x86_64-linux-gnu -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/13/../../.. /tmp/ccwriUWR.o -lgcc --push-state --as-needed -lgcc_s --pop-state -lc -lgcc --push-state --as-needed -lgcc_s --pop-state /usr/lib/gcc/x86_64-linux-gnu/13/crtendS.o /usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu/crtn.o
COLLECT_GCC_OPTIONS='-v' '-o' 'test.o' '-mtune=generic' '-march=x86-64' '-dumpdir' 'test.o.'

here you can see three phases:

  1. Compilation of your .c files into .o object files, via calls to cc1 and as.

  2. Linking of those object files together plus the startup files (Scrt1.o, crti.o, crtn.o) and the pre‐built C runtime libraries (GCC’s support libraries and the C standard library).

  3. Result is your final executable.

In your verbose dump the key line is buried in the collect2/ld invocation:

… -plugin-opt=-pass-through=-lc … -lc …

That -lc is the linker flag that tells it:

“Pull in the C standard library (libc), which already contains the compiled code for printf, fopen, etc.”

You do not compile stdio.c (or any of the .c sources of glibc) yourself. The C library ships as pre-compiled archives (libc.a) and shared objects (libc.so), and GCC drivers automatically pass -lc at link time so that all your <stdio.h> declarations get resolved to real code in libc.

a good read would be https://www.gnu.org/software/libc/manual/html_node/Header-Files.html and How does the compilation/linking process work?

Reasons:
  • Long answer (-1):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Low reputation (0.5):
Posted by: Hermit

79592386

Date: 2025-04-25 11:16:28
Score: 3.5
Natty:
Report link

Good Day! Here's my Wing Bank Account details:

Account Number: 086222216

Currency: USD

Account Holder Name: Pov Heang

Reasons:
  • Blacklisted phrase (1): Good Day
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Heang Pov

79592374

Date: 2025-04-25 11:08:26
Score: 5
Natty: 7
Report link

How can I remove the previous marker when clicking again and display only the latest marker?

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Ends in question mark (2):
  • Single line (0.5):
  • Low reputation (1):
Posted by: A Fakhimi

79592366

Date: 2025-04-25 11:04:25
Score: 4
Natty:
Report link

I don't seem to be able to reproduce the results, can anyone let me know what I'm missing?

import pandas as pd
import datetime
data = [
    [datetime.datetime(1970, 1, 1, 0, 0), 262.933],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 76923), 261.482],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 153846), 260.394],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 230769), 259.306],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 307692), 258.218],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 384615), 257.311],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 461538), 256.223],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 538461), 255.135],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 615384), 254.047],
    [datetime.datetime(1970, 1, 1, 0, 0, 0, 692307), 253.141],
]
df = pd.DataFrame(data, columns=["timestamp", "x"])
new_date_range = pd.date_range(datetime.datetime(1970, 1, 1, 0, 0), datetime.datetime(1970, 1, 1, 0, 0, 0, 692307), freq="100ms")
df.set_index("timestamp").reindex(new_date_range).interpolate().reset_index()\
# Output as below, but would expect x to vary...
                    index        x
0 1970-01-01 00:00:00.000  262.933
1 1970-01-01 00:00:00.100  262.933
2 1970-01-01 00:00:00.200  262.933
3 1970-01-01 00:00:00.300  262.933
4 1970-01-01 00:00:00.400  262.933
5 1970-01-01 00:00:00.500  262.933
6 1970-01-01 00:00:00.600  262.933
Reasons:
  • RegEx Blacklisted phrase (2.5): can anyone let me know what
  • Long answer (-1):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Low reputation (1):
Posted by: tomasito

79592351

Date: 2025-04-25 10:53:22
Score: 1
Natty:
Report link

You can achieve this in Informatica IICS using an Expression transformation. Try using this logic:

REPLACESTR(1, SUBSTR(your_field, 1, LENGTH(your_field) - INSTR(REVERSE(your_field), ' ')), ' ', '' ) || '.' || SUBSTR(your_field, LENGTH(your_field) - INSTR(REVERSE(your_field), ' ') + 2)

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Компания KAZХИМИЯ

79592345

Date: 2025-04-25 10:48:21
Score: 0.5
Natty:
Report link

This issue exists because version 0.2.1.post1 only added arm32 support. The developers of kaleido chose not to publish the wheels for other architectures as they were not changed (see).

You can run uv add kaleido==0.2.1 to install the latest version on any other architecture.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: be_cracked

79592337

Date: 2025-04-25 10:43:20
Score: 2
Natty:
Report link

For security reasons, ngrok does not accept connections from new clients unless you give consent first. If you open the ngrok URL from any device, it will first show an alert telling you the risks; Safaricom cannot approve the consent, and that is why the requests fail.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Albert Alberto

79592333

Date: 2025-04-25 10:39:19
Score: 2.5
Natty:
Report link

Intent intent = new Intent(Intent.ACTION_CALL); // ACTION_CALL places the call directly and requires the CALL_PHONE runtime permission

intent.setData(Uri.parse("tel:" + phoneNumber));

startActivity(intent);

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Mahamudul Mondal

79592321

Date: 2025-04-25 10:29:16
Score: 3
Natty:
Report link

The email field in the Firebase database for the documents with the error had a key of "email." instead of "email". This can be spotted by printing each field individually.
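
Not from the original answer: a small sketch of the print-each-field check, assuming Firestore and the google-cloud-firestore client; the "users" collection name is hypothetical:

from google.cloud import firestore

db = firestore.Client()  # assumes default credentials are configured

# Print each field name with repr() so stray characters like "email." stand out.
for doc in db.collection("users").stream():
    for key, value in doc.to_dict().items():
        print(repr(key), "->", value)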

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: michael agyo

79592311

Date: 2025-04-25 10:24:15
Score: 1
Natty:
Report link

For an AWS SAM template, I used a CloudFormation condition like so:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Example

Parameters:
  Environment:
    Type: String
    Description: Deployment environment
    Default: prd

Conditions:
  IsProd:
    Fn::Equals:
      - !Ref Environment
      - prd

Resources:
  ExampleScheduledFunction:
    Type: AWS::Serverless::Function
    Properties:
      Events:
        ScheduleEvent:
          Type: ScheduleV2
          Properties:
            ScheduleExpression: "cron(0 3 ? * MON *)"
            State:
              Fn::If:
                - IsProd
                - ENABLED
                - DISABLED

Reference:

Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Zoey

79592306

Date: 2025-04-25 10:22:14
Score: 1
Natty:
Report link
import ReactDOM from "react-dom/client";
import App from "./App"; // adjust the import path to wherever your App component lives

// Only mount the client-side app for routes that are not server-rendered.
const serverRenderPaths = ["/docs", "/dashboard"];
if (!serverRenderPaths.includes(window.location.pathname)) {
  ReactDOM.createRoot(document.getElementById("root")).render(<App />);
}

This way you can exclude particular routes from client-side rendering.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Vijayasuriyan V

79592300

Date: 2025-04-25 10:19:09
Score: 7.5 🚩
Natty:
Report link

I have the same issue with my Azure DevOps POST call:
POST: https://dev.azure.com/$organization/$project/\_apis/test/Runs/$runID/results?api-version=7.1-preview.3

Body:

{
    "results":[
                {"durationInMs":469.0,"automatedTestType":"UnitTest","testCase":{"id":2233},"state":"Completed","outcome":"Passed","automatedTestName":"[TC-2233]My_Login.","automatedTestStorage":"MYTest.postman_collection.json"},
                {"durationInMs":384.0,"automatedTestType":"UnitTest","testCase":{"id":3240},"state":"Completed","outcome":"Passed","automatedTestName":"[TC-3240] My_Alerts","automatedTestStorage":"MYTest.postman_collection.json"}
            ]
}

But the response body is a 400 Bad Request:

{
    "$id": "1",
    "innerException": null,
    "message": "Value cannot be null.\r\nParameter name: resultCreateModel",
    "typeName": "System.ArgumentNullException, mscorlib",
    "typeKey": "ArgumentNullException",
    "errorCode": 0,
    "eventId": 0
}

Can someone help me with this?

Reasons:
  • Blacklisted phrase (1): help me
  • RegEx Blacklisted phrase (3): Can someone help me
  • Probably link only (1):
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Me too answer (2.5): I have same issue
  • Low reputation (1):
Posted by: Khushboo Kumari

79592299

Date: 2025-04-25 10:19:09
Score: 1
Natty:
Report link

I had this same issue and nothing in the other answers worked for me. All I did was delete my existing storage bucket from the Firebase console, click Get Started again, and when prompted to choose a location, unselect the "no cost location" option and pick a location from "All locations". After setting it up, I ran firebase init again.

Reasons:
  • Whitelisted phrase (-1): worked for me
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: marshal moses

79592294

Date: 2025-04-25 10:17:08
Score: 0.5
Natty:
Report link

Regarding the output shape of your YOLOv8 detection model being `(1, 7, 8400)` for 3 classes: rather than being wrong, this is actually the **correct and expected raw output format** for YOLOv8 before post-processing.

Let's break down the meaning of this shape:

- `1` is the batch size.
- `7` is `3 (number of classes) + 4 (bounding box parameters: x, y, w, h)`.
- `8400` is the number of candidate predictions produced by the detection head across its three feature-map scales (for a 640×640 input: 80×80 + 40×40 + 20×20 = 8400).

Contrast this with the standard YOLOv8 detection model (trained on 80 COCO classes), whose raw detection output shape is typically (1, 84, 8400). Here, `84` also follows the same pattern: `80 (number of classes) + 4 (bounding box parameters) = 84`. This further confirms that the output dimension structure is "number of classes + 4".

This (1, 7, 8400) tensor is the raw prediction result generated by the YOLOv8 model after the network layers. It still needs to go through **post-processing steps**, such as confidence thresholding and Non-Maximum Suppression (NMS), to obtain the final list of detected bounding boxes (e.g., each detection including location, confidence, class ID, etc.). The final detection results you typically work with are the output after these post-processing steps, not this raw (1, 7, 8400) tensor itself.

Please note that within the YOLOv8 model family, the output shapes for different tasks (such as detection vs. segmentation) are different. For example, the output of a YOLOv8 segmentation model (like YOLOv8n-seg) might include a tensor with a shape like (1, 116, 8400) (combining classes, box parameters, and mask coefficients) and another output for prototype masks. This also illustrates that the output shape structure is determined by the specific task and configuration of the model.
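
For reference, a minimal post-processing sketch, assuming the raw output is already a NumPy array of shape (1, 7, 8400); the threshold is arbitrary and NMS is omitted for brevity:

import numpy as np

def decode_raw_output(raw, conf_threshold=0.25):
    # raw: (1, 7, 8400) -> (8400, 7); columns are [cx, cy, w, h, cls0, cls1, cls2]
    preds = raw[0].T
    boxes, class_scores = preds[:, :4], preds[:, 4:]
    class_ids = class_scores.argmax(axis=1)
    confidences = class_scores.max(axis=1)
    keep = confidences > conf_threshold
    # Non-Maximum Suppression would normally follow to drop overlapping boxes.
    return boxes[keep], class_ids[keep], confidences[keep]

# Dummy data shaped like the model output, just to show the shapes involved.
dummy = np.random.rand(1, 7, 8400).astype(np.float32)
b, c, s = decode_raw_output(dummy)
print(b.shape, c.shape, s.shape)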

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: MDR

79592292

Date: 2025-04-25 10:14:07
Score: 6 🚩
Natty: 5.5
Report link

Have you found a working solution?

Reasons:
  • Low length (2):
  • No code block (0.5):
  • Ends in question mark (2):
  • Single line (0.5):
  • Low reputation (1):
Posted by: user3505896

79592291

Date: 2025-04-25 10:13:06
Score: 0.5
Natty:
Report link

Since, as far as I could find (and based on the lack of responses), there seems to be no way for Lua filters to do this, I decided to solve the issue with Python and mark this question as solved.

The workaround I found is to pre-process the GitLab-flavored Markdown in Python (expanding the ::include directives and replacing [[_TOC_]] with a generated table of contents) and only then hand the result to pandoc via pypandoc.

The code I used is provided below. Maybe someone finds a way to do something like this within pandoc, but for now this effectively solves my problem :)

import os
import re

import pypandoc

# Pre-processes a Gitlab-flavored Markdown file such that
#   - ::include directives are replaced by the actual file
#   - [[_TOC_]] is replaced by a generated table of contents

# Requires pandoc!!!
# See https://pypi.org/project/pypandoc/

pandoc_location = r'<pandoc_folder>\pandoc.exe'
input_file = r'<path_to_your_file.md>'
to_format = 'html5'

print(f'Setting pandoc location to {pandoc_location}')
os.environ.setdefault('PYPANDOC_PANDOC', pandoc_location)

current_path = __file__
current_folder, current_filename = os.path.split(current_path)
tmp_file = os.path.join(current_folder, 'tmp.md')
print(f'Using tmp. file {tmp_file}')

with open(input_file, 'r') as f:
    input_md = f.read()

print(f'Read {input_file}. Length={len(input_md)}')

input_folder, input_file = os.path.split(input_file)
input_base, input_ext = os.path.splitext(input_file)

all_matches = [re.match(r'\:\:include{file=([\W\w\.\/\d]+)}', e) for e in input_md.splitlines() ]
all_matches = [e for e in all_matches if e is not None]
for include_match in all_matches:
    include_path = include_match.group(1)
    abs_path = os.path.abspath(os.path.join(input_folder, include_path))
    print(f'Including {abs_path}')
    try:
        with open(abs_path, 'r') as f:
            include_file_content = f.read()
        input_md = input_md.replace(include_match.group(0), include_file_content)
    except Exception as e:
        print(f'Could not include file: {e}')

# Process ToC
def slugify(text):
    """Converts heading text into a GitHub-style anchor slug."""
    text = text.strip().lower()
    text = re.sub(r'[^\w\s-]', '', text)
    return re.sub(r'[\s]+', '-', text)

def strip_markdown_links(text):
    """Extracts visible text from markdown-style links [text](url)."""
    return re.sub(r'\[([^\]]+)\]\([^)]+\)', r'\1', text)

def extract_headings(markdown):
    """Extracts headings ignoring code blocks, and handles markdown links."""
    headings = []
    in_code_block = False

    for line in markdown.splitlines():
        if line.strip().startswith("```"):
            in_code_block = not in_code_block
            continue
        if in_code_block:
            continue

        match = re.match(r'^(#{1,6})\s+(.*)', line)
        if match:
            level = len(match.group(1))
            raw_text = match.group(2).strip()
            clean_text = strip_markdown_links(raw_text)
            slug = slugify(clean_text)
            headings.append((level, clean_text, slug))

    return headings

def generate_toc(headings):
    """Generates TOC from extracted headings."""
    toc_lines = []
    for level, text, slug in headings:
        indent = '  ' * (level - 1)
        toc_lines.append(f"{indent}- [{text}](#{slug})")
    return '\n'.join(toc_lines)

# Replace Gitlab's [[_TOC_]] with the actual ToC
print(f'Generating ToC from [[_TOC_]]')
headings_input = extract_headings(input_md)
toc = generate_toc(headings_input)

# The HTML output seems NOT to like it if the anchor is "#3gppsa2".
# The number "3" is lost in the HTML conversion. This should remedy this
# Please note that this "hack" results in the navigation of tmp.md being broken. But the output HTML is OK
toc = toc.replace('(#3gppsa2', '(#gppsa2')

input_md = input_md.replace('[[_TOC_]]', toc)

with open(tmp_file, 'w') as f:
    f.write(input_md)
print(f'Wrote {tmp_file}')

print(f'Converting {tmp_file} to {to_format}')
# CSS from https://jez.io/pandoc-markdown-css-theme/#usage
# https://github.com/jez/pandoc-markdown-css-theme
# Fixed title with https://stackoverflow.com/questions/63928077/how-can-i-add-header-metadata-without-adding-the-h1
# Using markdon-smart to fix wrongly-displayed single-quotes
output = pypandoc.convert_file(
    source_file=tmp_file,
    to=f'{to_format}',
    extra_args=[
        '--from=markdown-smart',
        '--standalone',
        '--embed-resources=true',
        '--css=theme.css',
        '--html-q-tags=true',
        f'--metadata=title={input_base}',
        '--variable=title='
    ])

match to_format:
    case 'html' | 'html5':
        output_ext = 'html'
    case _:
        output_ext = to_format

output_file = os.path.join(input_folder, f'{input_base}.{output_ext}')

with open(output_file, 'w') as f:
    f.write(output)
print(f'PyPandoc output saved to: {output_file}')
Reasons:
  • Blacklisted phrase (1): stackoverflow
  • Long answer (-1):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: Josep

79592287

Date: 2025-04-25 10:10:06
Score: 3.5
Natty:
Report link

I don't know if you have found a solution, but for anyone who stumbles upon this question looking for an answer, try sending the header as 'Authorization: JWT your_token'.
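
A short sketch of that header with the Python requests library; the URL and token are placeholders:

import requests

token = "your_token"  # placeholder
resp = requests.get(
    "https://api.example.com/protected/",
    headers={"Authorization": f"JWT {token}"},
)
print(resp.status_code)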

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Itachi

79592280

Date: 2025-04-25 10:06:04
Score: 0.5
Natty:
Report link

While I don't know memgraph or their particular implementation of openCypher, I might at least be able to give some potential insight regarding:

that an exists() can only take one relationship (which I thought I'd comply with)

I believe that the WHERE part in

exists((c) <-[:contains]- (center) WHERE center.name CONTAINS "special")

might be the issue, as that is something more than just a relationship.

This is based on my experience with Neo4j and its Cypher, though, so it might differ in Memgraph, but it would be my guess at least.

As a thought experiment: would it be possible to calculate all the values, or at least the conditions, separately from the SET, so that the SET and the exists() call are split? For example, compute something in a preceding WITH clause and then use that in the SET.

Reasons:
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: Therese

79592278

Date: 2025-04-25 10:05:04
Score: 0.5
Natty:
Report link

Try the following code and see if it works:

@media print
{
  header, footer
  {
   display: none;
  }
}
Reasons:
  • Whitelisted phrase (-1): Try the following
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Speed Ack

79592267

Date: 2025-04-25 10:00:03
Score: 1.5
Natty:
Report link

Thanks for all those details. I just had a look at your Flow, and you need to either:

Right now you've defined "error_messages" in "data", but you are not making use of it.

Reasons:
  • Blacklisted phrase (0.5): Thanks
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Diego

79592263

Date: 2025-04-25 09:58:02
Score: 1
Natty:
Report link

I have the same setup. The problem is pkginfo. I updated to version 1.12.1.2 and it fixed my problem.

pip install --upgrade pkginfo

Hopefully the twine update will come soon

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Sequestered1776Vexer

79592248

Date: 2025-04-25 09:49:00
Score: 0.5
Natty:
Report link

For more modern C# (from version 6), you can simply use string interpolation.

Console.WriteLine($"{5488461193L:X}");

This also works when assigning variables:

var hexString = $"{5488461193L:X}";
Reasons:
  • Low length (1):
  • Has code block (-0.5):
Posted by: nobody

79592246

Date: 2025-04-25 09:47:59
Score: 1
Natty:
Report link

I managed to work around the issue by passing the below config parameters to the boto3 client:

import boto3
from botocore.config import Config

# Adaptive retry mode automatically backs off when Bedrock throttles requests.
bedrock_client = boto3.client(
    'bedrock-runtime',
    config=Config(retries={'max_attempts': 5, 'mode': 'adaptive'})
)
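
And a hedged usage sketch on top of that client, assuming a recent boto3 that exposes the Converse API; the model ID is only an example:

# Uses the bedrock_client configured above.
response = bedrock_client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)
print(response["output"]["message"]["content"][0]["text"])

With adaptive retry mode, throttling errors from Bedrock are retried with backoff instead of surfacing immediately.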
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Pascal Wyler

79592240

Date: 2025-04-25 09:44:59
Score: 3.5
Natty:
Report link

Basically, with the help of @Paulw11, this is what I did:

Once that's added and configured, it will download the profile and load it in the simulator; it is also able to show the Azure B2C login.

Reasons:
  • Probably link only (1):
  • No code block (0.5):
  • User mentioned (1): @Paulw11
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: Lóri Nóda

79592239

Date: 2025-04-25 09:44:58
Score: 6.5 🚩
Natty: 5.5
Report link

I have the same problem. I upgraded Spring Boot from 3.3.4 to 3.4.3, but my mapping is different, so the solution with CascadeType.ALL doesn't work:


public class Parent {
    private Long idParent;
    @OneToMany(cascade=CascadeType.REMOVE)
    @JoinColumn(name="id_parent")
    private List<Child> children = new ArrayList<>();
}

public class Child {
    @Column(name = "id_parent")
    private Long idParent;
}

I have the same problem with:


Child child = childDao.findById(idChild);
Parent parent = parentDao.findById(child.getIdParent());
// ...some checks on parent
childDao.deleteById(idChild);

The only solution I found is to call entityManager.clear() before the delete:


Child child = childDao.findById(idChild);
Parent parent = parentDao.findById(child.getIdParent());
// ...some checks on parent
entityManager.clear();
childDao.deleteById(idChild);

???

Reasons:
  • Blacklisted phrase (1): I have the same problem
  • Blacklisted phrase (1): ???
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Me too answer (2.5): I have the same problem
  • Ends in question mark (2):
  • Low reputation (1):
Posted by: Simon

79592233

Date: 2025-04-25 09:41:57
Score: 1
Natty:
Report link

You can change the background color with the navBarBuilder. Thanks

navBarBuilder: (navBarConfig) => Style5BottomNavBar(
  navBarConfig: navBarConfig,
  navBarDecoration: const NavBarDecoration(
    color: Colors.black,
  ),
),
Reasons:
  • Blacklisted phrase (0.5): Thanks
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Artur

79592231

Date: 2025-04-25 09:40:57
Score: 1.5
Natty:
Report link

There is not really one single specification but rather a list of them. A very good source is still this book, and for your question this chapter: https://books.sonatype.com/mvnex-book/reference/simple-project-sect-simple-core.html#:~:text=Maven%20coordinates%20define%20a%20set,look%20at%20the%20following%20POM.&text=We've%20highlighted%20the%20Maven,%2C%20artifactId%20%2C%20version%20and%20packaging%20.
In general, for artifact identity, think in terms of the repository path layout that is created. This is based on literal string values, not abstract versions.

ComparableVersion is used for sorting versions and version ranges, but such versions won't be resolved as the same artifact. As a test, create these artifacts with different version numbers yourself and then look at your local repository (https://maven.apache.org/repository/layout.html). You will find the different versions in different folders.

Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Mr__Steel

79592227

Date: 2025-04-25 09:38:56
Score: 5
Natty:
Report link

Follow this link to learn how to download and install the plugin.

Reasons:
  • Blacklisted phrase (1): this link
  • Low length (1.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Alex