79775630

Date: 2025-09-26 08:25:17
Score: 2.5
Natty:
Report link

The solution for this question is a custom project I made that makes it possible to sanitize data in the logging.

See
- https://github.com/StefH/SanitizedHttpLogger
- https://www.nuget.org/packages/SanitizedHttpClientLogger
- https://www.nuget.org/packages/SanitizedHttpLogger

And see this blogpost for more explanation and details:
- https://mstack.nl/blogs/sanitize-http-logging/

Reasons:
  • Blacklisted phrase (1): this blog
  • Probably link only (1):
  • Low length (0.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • High reputation (-1):
Posted by: Stef Heyenrath

79775624

Date: 2025-09-26 08:19:15
Score: 12 🚩
Natty: 6.5
Report link

Has this issue been resolved? I'm having the same problem.

Reasons:
  • Blacklisted phrase (1): I'm having the same problem
  • Blacklisted phrase (3): Has this issue been
  • RegEx Blacklisted phrase (1.5): resolved?
  • Low length (1.5):
  • No code block (0.5):
  • Me too answer (2.5): I'm having the same problem
  • Contains question mark (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Steven Park

79775618

Date: 2025-09-26 08:12:14
Score: 0.5
Natty:
Report link

So, the solution I arrived at was to use reticulate.

If someone has a pure R solution that follows a similar pattern, I would still be interested in hearing it and changing the accepted solution.

reticulate::py_require("polars[database]")
reticulate::py_require("sqlalchemy")

polars     <- reticulate::import("polars")
sqlalchemy <- reticulate::import("sqlalchemy")

engine    <- sqlalchemy$create_engine("sqlite:///transactions.sqlite3", future = TRUE)
dataframe <- polars$DataFrame(data.frame(x = 1:5, y = letters[1:5]))

with(
    engine$begin() %as% conn, 
    {
        dataframe$write_database("table_a", conn, if_table_exists = "append")
        dataframe$write_database("table_b", conn, if_table_exists = "append")
        dataframe$write_database("table_c", conn, if_table_exists = "append")
        stop("OOPS :(")
    }
)

Note: there was a bug in with() which the maintainers were kind enough to fix within a day, and this now works (i.e. the whole transaction is rolled-back upon error) with the latest branch.

Reasons:
  • Blacklisted phrase (1): :(
  • Long answer (-1):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: FISR

79775617

Date: 2025-09-26 08:10:13
Score: 1.5
Natty:
Report link

A line with a - in front of it will not make it to the new file.

A line with a + in front of it is not in the old file.

A line with no sign is in both files.

Ignore the wording:

If you want a - line to make it to the new file, delete the - but carefully leave an empty space in its place.

If you want a + line to not make it to the new file – just delete the line.

What could be simpler?

Don't forget to change the two pairs of numbers at the top so that, for each pair, the number to the right of the comma is exactly equal to the number of lines in the hunk for its respective file; otherwise the edit will be rejected. That was too much of a mouthful, so they didn't bother explaining it.
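For reference, this is what such a hunk header looks like (the line numbers are illustrative):

@@ -12,5 +12,6 @@

The pair after the minus sign is the starting line and line count of the hunk in the old file, and the pair after the plus sign is the starting line and line count in the new file; here the hunk covers 5 old lines and 6 new lines, i.e. one line was added.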

Reasons:
  • Blacklisted phrase (1): What could be
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: danwroy

79775611

Date: 2025-09-26 08:08:12
Score: 7.5
Natty: 7.5
Report link

If I have 2 (or more, range-loop generated) buttons calling the same callback, how do I know which one fired the event? How do I attach any data to the event?

Reasons:
  • Blacklisted phrase (1): how do I
  • Blacklisted phrase (1): How do I
  • Low length (1):
  • No code block (0.5):
  • Ends in question mark (2):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: miko

79775610

Date: 2025-09-26 08:07:12
Score: 0.5
Natty:
Report link

By just looking at your screenshot, the chances are high that you are using some CSS transform property on the component, which leads to a scaling "bug", since transform is intended more for SVG graphics than for layout.

for example:

transform: translateY(max(-50%, -50vh));

Try using flex layout instead.
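A minimal sketch of centering with flexbox instead of a transform (the selector name is illustrative, not taken from your code):

.dialog-host {
  display: flex;
  justify-content: center; /* horizontal centering */
  align-items: center;     /* vertical centering */
  min-height: 100vh;
}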

See: https://github.com/angular/components/issues/10710

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: reifi

79775606

Date: 2025-09-26 08:04:11
Score: 1.5
Natty:
Report link

You could turn the reference to Document into a one-to-one instead of a foreign key; that way you would have the option to set the cascadeDelete parameter to true.

If you are not allowed to alter the data model and drop the database, you would need to create an upgrade trigger instead.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Enrique Alonso

79775599

Date: 2025-09-26 07:56:09
Score: 3
Natty:
Report link

Gotta love multi-platform tools that don't follow platform standards. C:\ProgramData, although not quite kosher, works just fine.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Miltiades STNE

79775598

Date: 2025-09-26 07:56:09
Score: 1
Natty:
Report link

I came across this while looking for a way to skip a non-picklable attribute, and based on JacobP's answer I'm using the code below. The copy keeps the same reference to skipped as the original instance.

import copy  # needed at module level for copy.deepcopy below

# Define this method inside your class:
def __deepcopy__(self, memo):
    cls = self.__class__
    obj = cls.__new__(cls)
    memo[id(self)] = obj
    for k, v in self.__dict__.items():
        if k not in ['skipped']:          # attributes listed here are shared, not copied
            v = copy.deepcopy(v, memo)
        setattr(obj, k, v)
    return obj
Reasons:
  • Has code block (-0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: JonM

79775595

Date: 2025-09-26 07:51:08
Score: 1
Natty:
Report link

How to Add Hooks into CRM Software

Hooks in CRM software are automation triggers that allow you to connect your CRM with other applications or internal workflows. They save time, reduce manual work, and ensure smooth data flow across systems. Here’s how you can add hooks into a CRM:

  1. Identify Key Events

    • Decide which events should trigger a hook, such as:

      • When a new lead is created

      • When a deal is closed

      • When an invoice is generated

      • When an employee’s attendance is marked

  2. Use Webhooks or APIs

    • Most modern CRMs provide webhook or API integrations. A webhook pushes data to another application when a defined event occurs.

    • Example: If a new lead is added in the CRM, a webhook can automatically send that lead’s details to your email marketing tool (a minimal receiver sketch is shown at the end of this answer).

  3. Configure the Destination App

    • Decide where the data should go. Hooks can integrate your CRM with:

      • Email automation tools

      • Accounting software

      • HR or payroll systems

      • Inventory management solutions

  4. Test the Workflow

    • Run a test to ensure the hook is working properly. Check whether the data is being transferred correctly and triggers are firing without delay.
  5. Automate & Scale

    • Once working, expand your hooks to cover more business processes like customer support tickets, vendor updates, or sales pipeline changes.

By choosing a flexible platform like SYSBI Unified CRM, businesses can easily add hooks, streamline processes, and connect multiple operations without relying on separate tools.
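As a rough illustration of step 2, the receiving end of a webhook is just an HTTP endpoint that the CRM calls when the event fires. A minimal sketch in Python with Flask (the URL path and payload fields are assumptions, not any specific CRM's format):

from flask import Flask, request

app = Flask(__name__)

@app.route("/hooks/new-lead", methods=["POST"])
def new_lead():
    payload = request.get_json()                 # the CRM posts the event as JSON
    print("New lead:", payload.get("name"), payload.get("email"))
    return "", 204                               # acknowledge quickly; do the real work asynchronously

if __name__ == "__main__":
    app.run(port=5000)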

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Starts with a question (0.5): How to Add
  • Low reputation (1):
Posted by: SYSBI-Unified CRM

79775593

Date: 2025-09-26 07:49:07
Score: 1.5
Natty:
Report link

Actually, these 3 input boxes are just parameters for vcvarsall.bat.

So there's a hacky workaround: specify versions in any of the input boxes, as long as vcvarsall.bat recognizes them.
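As a hypothetical illustration (the version numbers are placeholders; vcvarsall.bat takes an architecture plus optional Windows SDK and -vcvars_ver toolset versions):

vcvarsall.bat x64 10.0.22621.0 -vcvars_ver=14.29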

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Tony

79775589

Date: 2025-09-26 07:47:07
Score: 1.5
Natty:
Report link

Well, looks like we had to copy over some more code from staging to live.
Then it worked. But the error is not very clear about what the problem is...

Reasons:
  • Whitelisted phrase (-1): it worked
  • Low length (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: MBouwman

79775588

Date: 2025-09-26 07:46:06
Score: 1
Natty:
Report link

In the project file, add:

<PropertyGroup>
  <EnableDefaultContentItems>false</EnableDefaultContentItems>
</PropertyGroup>

This stops the SDK from adding Content items automatically, and keeps only the files you explicitly declare with <Content Include="..." />.
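For example, with the default items disabled you then list the content you actually want yourself (the paths are illustrative):

<ItemGroup>
  <Content Include="appsettings.json" CopyToOutputDirectory="PreserveNewest" />
  <Content Include="wwwroot\**\*" />
</ItemGroup>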

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: hoda mohamed

79775584

Date: 2025-09-26 07:42:05
Score: 0.5
Natty:
Report link

I eventually found a solution.

I think it's not clean but it works.

It uses the procedure from "Installing the SageMath Jupyter Kernel and Extensions".

venv/bin/python
>>> from sage.all import *
>>> from sage.repl.ipython_kernel.install import SageKernelSpec
>>> prefix = tmp_dir()
>>> spec = SageKernelSpec(prefix=prefix)
>>> spec.kernel_spec()

I corrected each error with a symbolic link.

sudo ln -s /usr/lib/python3.13/site-packages/sage venv/lib/python3.13/site-packages/
sudo ln -s /usr/lib/python3.13/site-packages/cysignals venv/lib/python3.13/site-packages/
sudo ln -s /usr/lib/python3.13/site-packages/gmpy2 venv/lib/python3.13/site-packages/
sudo ln -s /usr/lib/python3.13/site-packages/cypari2 venv/lib/python3.13/site-packages/
sudo ln -s /usr/lib/python3.13/site-packages/memory_allocator venv/lib/python3.13/site-packages/

And finally,

>>> spec.kernel_spec()
{'argv': ['venv/bin/sage', '--python', '-m', 'sage.repl.ipython_kernel', '-f', '{connection_file}'], 'display_name': 'SageMath 10.7', 'language': 'sage'}

I put this thing in /usr/share/jupyter/kernels/sagemath/kernel.json.in, and it works.

Reasons:
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Valère Bonnet

79775582

Date: 2025-09-26 07:38:04
Score: 1
Natty:
Report link

Original poster of the question here.
The reason the ComboBox wasn't showing any items was that I had missed the DataGridView's ReadOnly property and left it set to True.
After changing it to False, the ComboBox worked perfectly.

Here's the code:

DataGridViewComboBoxColumn column = new DataGridViewComboBoxColumn();

column.Items.Add("実案件");
column.Items.Add("参考見積り");

column.DataPropertyName = dataGridView_検索.Columns["見積もり日区分"].DataPropertyName;
dataGridView_検索.Columns.Insert(dataGridView_検索.Columns["見積もり日区分"].Index, column);
dataGridView_検索.Columns.Remove("見積もり日区分");
column.Name = "見積もり日区分";     
column.HeaderText = "見積もり日区分";
column.FlatStyle = FlatStyle.Flat;
column.DisplayStyle = DataGridViewComboBoxDisplayStyle.ComboBox;
column.DefaultCellStyle.BackColor = Color.FromArgb(255, 255, 192);
column.MinimumWidth = 150;
Reasons:
  • Probably link only (1):
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: Anpo Desu

79775581

Date: 2025-09-26 07:37:04
Score: 0.5
Natty:
Report link

When a path parameter is present and contains a very long path, the API often ignores the visible parameter, then adjusts the map's center so that the entire path is still visible.

Considering that you only want to show a specific segment, the most reliable workaround would be to use the center and zoom parameters:

zoom=18&center=51.47830481493033,5.625173621802276&key=XXX
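Put together, a full Static Maps request that pins the view to your segment might look like this (the size, path style, and path coordinates are illustrative):

https://maps.googleapis.com/maps/api/staticmap?size=640x400&zoom=18&center=51.47830481493033,5.625173621802276&path=color:0x0000ff|weight:4|51.4781,5.6249|51.4785,5.6254&key=XXX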
Reasons:
  • Has code block (-0.5):
  • Starts with a question (0.5): When
  • Low reputation (0.5):
Posted by: Lanceslide

79775568

Date: 2025-09-26 07:14:59
Score: 1
Natty:
Report link

Issue resolved by simply following this [youtube video](https://www.youtube.com/watch?v=QuN63BRRhAM); it's official content from Expo.

- See my current package.json:

{
  "name": "xyz",
  "version": "1.0.0",
  "scripts": {
    "start": "expo start --dev-client",
    "android": "expo run:android",
    "ios": "expo run:ios",
    "web": "expo start --web"
  },
  "dependencies": {
    "@expo/vector-icons": "^15.0.2",
    "@react-native-async-storage/async-storage": "2.2.0",
    "@react-native-community/datetimepicker": "8.4.4",
    "@react-native-community/netinfo": "^11.4.1",
    "@react-navigation/native": "^6.1.18",
    "@react-navigation/stack": "^6.3.20",
    "@supersami/rn-foreground-service": "^2.2.1",
    "base-64": "^1.0.0",
    "date-fns": "^3.6.0",
    "expo": "^54.0.10",
    "expo-background-fetch": "~14.0.6",
    "expo-build-properties": "~1.0.7",
    "expo-calendar": "~15.0.6",
    "expo-camera": "~17.0.7",
    "expo-dev-client": "~6.0.11",
    "expo-font": "~14.0.7",
    "expo-gradle-ext-vars": "^0.1.2",
    "expo-image-manipulator": "~14.0.7",
    "expo-image-picker": "~17.0.7",
    "expo-linear-gradient": "~15.0.6",
    "expo-location": "~19.0.6",
    "expo-media-library": "~18.2.0",
    "expo-sharing": "~14.0.7",
    "expo-status-bar": "~3.0.7",
    "expo-task-manager": "~14.0.6",
    "expo-updates": "~29.0.9",
    "framer-motion": "^11.5.4",
    "jwt-decode": "^4.0.0",
    "react": "19.1.0",
    "react-dom": "19.1.0",
    "react-native": "0.81.4",
    "react-native-background-fetch": "^4.2.7",
    "react-native-background-geolocation": "^4.18.4",
    "react-native-calendars": "^1.1306.0",
    "react-native-gesture-handler": "~2.28.0",
    "react-native-jwt": "^1.0.0",
    "react-native-linear-gradient": "^2.8.3",
    "react-native-modal-datetime-picker": "^18.0.0",
    "react-native-month-picker": "^1.0.1",
    "react-native-reanimated": "~4.1.1",
    "react-native-reanimated-carousel": "^4.0.3",
    "react-native-safe-area-context": "~5.6.0",
    "react-native-screens": "~4.16.0",
    "react-native-vector-icons": "^10.1.0",
    "react-native-view-shot": "~4.0.3",
    "react-native-webview": "13.15.0",
    "react-native-worklets": "0.5.1",
    "react-swipeable": "^7.0.1",
    "rn-fetch-blob": "^0.12.0"
  },
  "devDependencies": {
    "@babel/core": "^7.20.0",
    "@babel/plugin-transform-private-methods": "^7.24.7",
    "local-ip-url": "^1.0.10",
    "rn-nodeify": "^10.3.0"
  },
  "resolutions": {
    "react-native-safe-area-context": "5.6.1"
  },
  "private": true,
  "expo": {
    "doctor": {
      "reactNativeDirectoryCheck": {
        "exclude": [
          "@supersami/rn-foreground-service",
          "rn-fetch-blob",
          "base-64",
          "expo-gradle-ext-vars",
          "framer-motion",
          "react-native-jwt",
          "react-native-month-picker",
          "react-native-vector-icons",
          "react-swipeable"
        ]
      }
    }
  }
}
Reasons:
  • Blacklisted phrase (1): youtube.com
  • Long answer (-1):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Ganesh Mohane

79775566

Date: 2025-09-26 07:12:59
Score: 1.5
Natty:
Report link

Just in case someone comes to this page for the same reason I did: I migrated an application to Java 17, but my services on Ignite are still on Java 11 for some reason. Calling such a service throws the exception "Ignite failed to process request [142]: Failed to deserialize object [typeId=-1688195747]".

The reason was that I was using the stream method toList() in my Java 17 app and calling a service on Ignite with an argument that contains such a List. Replacing it with collect(Collectors.toList()) solved the issue.
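A minimal sketch of the change (the stream contents are illustrative):

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

List<String> before = Stream.of("a", "b").toList();                        // Java 16+ unmodifiable list; triggered the error here
List<String> after  = Stream.of("a", "b").collect(Collectors.toList());    // deserializes fine when the service still runs on Java 11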

Reasons:
  • No code block (0.5):
  • Low reputation (1):
Posted by: Михаил Байков

79775565

Date: 2025-09-26 07:11:58
Score: 1.5
Natty:
Report link

No, the total size of your database will have a negligible impact on the performance of your queries for recent data, thanks to ClickHouse's design.

Your setup is well suited to this type of query, and performance should remain fast even as the table grows, because of these things:

Reasons:
  • Blacklisted phrase (0.5): thanks
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Madusha Prasad

79775545

Date: 2025-09-26 06:45:52
Score: 1
Natty:
Report link

Linear Regression is a good starting point for predicting medical insurance costs. The idea is to model charges as a function of features like age, BMI, number of children, smoking habits, and region.

Steps usually include:

  1. Prepare the data – encode categorical variables (like sex, smoker, region) into numerical values.

  2. Split the data – use train-test split to evaluate the model’s performance.

  3. Train the model – fit Linear Regression on training data.

  4. Evaluate – use metrics like Mean Squared Error (MSE) and R² score to check accuracy.

  5. Predict – use the model to estimate charges for new individuals based on their features.

Keep in mind: Linear Regression works well if the relationship is mostly linear. For more complex patterns, Polynomial Regression or Random Forest can improve predictions.

If you want a Python example with a dataset and code for better understanding, a minimal sketch follows below.
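A minimal sketch of the steps above with scikit-learn, assuming the usual insurance dataset columns (age, sex, bmi, children, smoker, region, charges) and the file name insurance.csv:

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

df = pd.read_csv("insurance.csv")

# 1. Encode categorical variables
X = pd.get_dummies(df.drop(columns="charges"), drop_first=True)
y = df["charges"]

# 2. Train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# 3. Train the model
model = LinearRegression().fit(X_train, y_train)

# 4. Evaluate
pred = model.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))
print("R2:", r2_score(y_test, pred))

# 5. Predict charges for a new individual (columns must match X)
print(model.predict(X_test.iloc[[0]]))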

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Raviii

79775539

Date: 2025-09-26 06:35:50
Score: 2.5
Natty:
Report link

It's typically safe, but without any guarantee.

As mentioned in @axe's answer, it's fine if an implementation of string stores its data as a sequential character array, but that is not a standard guarantee.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • User mentioned (1): @axe
  • Low reputation (0.5):
Posted by: KSroido

79775535

Date: 2025-09-26 06:29:48
Score: 2.5
Natty:
Report link

Just so the info is here: instead of arecord and aplay, you should use tinycap (from tinyalsa) on Android, from what I remember.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Kin_G

79775517

Date: 2025-09-26 06:00:41
Score: 2
Natty:
Report link

Unexpected Git conflicts occur when multiple people make changes to the same lines of a file or when merging branches with overlapping edits. Git can’t automatically decide which change to keep, so manual resolution is needed.
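Resolving a conflict typically looks like this (the file name is illustrative):

git status                      # lists the conflicted files
# edit app.py and remove the <<<<<<< / ======= / >>>>>>> markers,
# keeping the lines you want from each side
git add app.py
git commit                      # or `git merge --continue` / `git rebase --continue`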

Read more: https://www.nike.com/

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Fiza Rao

79775510

Date: 2025-09-26 05:45:38
Score: 1.5
Natty:
Report link

I guess you need to use double curly braces in your prompt to avoid string formatting errors. I know the error message doesn't seem to be related to that.

Instead of {a: b}, write {{a: b}}.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Hasan Salim Kanmaz

79775490

Date: 2025-09-26 05:16:30
Score: 1
Natty:
Report link

Azure DevSecOps brings security into every stage of DevOps using a mix of Azure-native and third-party tools.

👉 At Cloudairy, we design DevSecOps pipelines that integrate these tools to keep code, infrastructure, and operations secure, compliant, and automated.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Darshan Cloudairy

79775487

Date: 2025-09-26 05:09:29
Score: 5
Natty: 4.5
Report link

Look! This can be more helpful:

https://medium.com/@shahbishwa21/automate-daily-commits-with-random-content-using-github-actions-804736759c1d

Reasons:
  • Blacklisted phrase (0.5): medium.com
  • Probably link only (1):
  • Low length (2):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Menula De Silva

79775475

Date: 2025-09-26 04:40:22
Score: 1
Natty:
Report link

Along with all other Azure products, Cognitive Services is part of the official collection of Azure architecture symbols that Microsoft provides. It is advised to use these icons in solution and architectural diagrams.

Get Azure Architecture Icons here.

Formats: SVG, PNG, and Visio stencils that work with programs like Lucidchart, Draw.io, PowerPoint, and Visio.

Service categories are used to arrange the icons. Cognitive Services is located in the AI + Machine Learning category.

Microsoft updates and maintains these icons to make sure they match the Azure logo.

Your architecture diagrams will adhere to Microsoft's design guidelines and maintain their visual coherence if you use these official icons.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Darshan Cloudairy

79775467

Date: 2025-09-26 04:14:17
Score: 1
Natty:
Report link

You can try to clean the Gradle caches to force a fresh download:

flutter clean
rm -rf ~/.gradle/wrapper/dists ~/.gradle/caches android/.gradle
flutter pub get

and then check the wrapper URL:

distributionUrl=https\://services.gradle.org/distributions/gradle-8.7-bin.zip

retry:

flutter run -v
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Gladys chan

79775466

Date: 2025-09-26 04:04:15
Score: 0.5
Natty:
Report link

You can also implement it yourself in a Spring Boot 2 application using Spring's ApplicationEvent and transaction synchronization.
You can follow the steps below (a minimal sketch follows the list):

- Create an outbox table with columns for a unique ID, event type, payload, and timestamp to persist events.

- Use a single database transaction to save both business data and the corresponding event to the outbox table.

- Implement a scheduled job to poll the outbox table, send unsent events to their destination, and then mark them as sent or delete them.

- Design event consumers to be idempotent, ensuring they can safely process duplicate messages without side effects.
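A rough sketch of steps 2 and 3 with Spring Data JPA and a scheduled poller (the entity, repository, topic, and helper names are assumptions for illustration, and the usual Spring/JPA imports are omitted for brevity; @Scheduled also needs @EnableScheduling on a configuration class):

@Entity
class OutboxEvent {
    @Id @GeneratedValue Long id;
    String eventType;
    String payload;                      // serialized event, e.g. JSON
    Instant createdAt = Instant.now();
    boolean sent = false;
}

interface OutboxRepository extends JpaRepository<OutboxEvent, Long> {
    List<OutboxEvent> findTop100BySentFalseOrderByCreatedAt();
}

@Service
class OrderService {
    private final OrderRepository orders;
    private final OutboxRepository outbox;
    OrderService(OrderRepository orders, OutboxRepository outbox) { this.orders = orders; this.outbox = outbox; }

    @Transactional                       // business row and outbox row commit together
    void placeOrder(Order order, String payloadJson) {
        orders.save(order);
        OutboxEvent e = new OutboxEvent();
        e.eventType = "ORDER_PLACED";
        e.payload = payloadJson;
        outbox.save(e);
    }
}

@Component
class OutboxRelay {
    private final OutboxRepository outbox;
    private final KafkaTemplate<String, String> kafka;   // or any other destination
    OutboxRelay(OutboxRepository outbox, KafkaTemplate<String, String> kafka) { this.outbox = outbox; this.kafka = kafka; }

    @Scheduled(fixedDelay = 5000)        // poll the outbox and forward unsent events
    @Transactional
    void relay() {
        for (OutboxEvent e : outbox.findTop100BySentFalseOrderByCreatedAt()) {
            kafka.send("orders", String.valueOf(e.id), e.payload);  // the id key lets idempotent consumers deduplicate
            e.sent = true;               // mark as sent (or delete the row)
        }
    }
}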

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: shailesh patil

79775462

Date: 2025-09-26 03:46:12
Score: 1
Natty:
Report link

Mine was solved because I had Platforms in my csproj:

<Platforms>x64;x86</Platforms>

I had to remove it for it to start building correctly.

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Raid

79775454

Date: 2025-09-26 03:12:05
Score: 0.5
Natty:
Report link
Reasons:
  • Blacklisted phrase (1): this guide
  • Long answer (-1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: CuongLc92

79775452

Date: 2025-09-26 03:11:05
Score: 1
Natty:
Report link

This solved my problem this time: I added a pyproject.toml file along with setup.py.

Content of pyproject.toml

[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

It generated a .whl file only for that specific package.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: awesome_sangram

79775444

Date: 2025-09-26 03:00:02
Score: 0.5
Natty:
Report link

The cause is unknown, but the dump file shows filesystem::path::~path being freed before init. It's a bug in Clang 20.1 that has been fixed in Clang 21+.

It could be a bug related to compiler reordering.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: KSroido

79775431

Date: 2025-09-26 02:33:57
Score: 0.5
Natty:
Report link

Simply convert the results to a Collection:

use Illuminate\Support\Collection; // Import if not already imported
$acc = DB::select('select id,name from accounts limit 5');
return  $collection = collect($acc);
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Hussein mahyoub

79775425

Date: 2025-09-26 02:13:53
Score: 0.5
Natty:
Report link
  1. It's not UB.

  2. As long as you know what you're doing, it's OK to use anything that compiles; that's how unsafe works.

  3. If the UnsafeCell is written to while the &T is being read, it's UB. If that never happens, then it's safe to use.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: KSroido

79775424

Date: 2025-09-26 02:09:51
Score: 1
Natty:
Report link

I would like to express my sincere gratitude to @Christoph Rackwitz for his suggestion. By visiting the website he shared, I obtained useful information. Given that there are currently very few online tutorials on using Nvidia GeForce RTX 50 series graphics cards to compile CUDA-enabled OpenCV library files, I am sharing my successful compilation experience here.

The version numbers of the various software and drivers I use are as follows:

OS : Windows 11

Cmake:3.28.0

Nvidia Cuda Version : 13.0

Cuda Toolkit:cuda 12.9

cudnn:9.13

Visual Studio:Microsoft Visual Studio Professional 2022 (x64)- LTSC 17.6,Version:17.6.22

OpenCV/OpenCV-contrib: 4.13.0-dev. Make sure to download the latest repository files from OpenCV's GitHub; the 4.12 source code cannot fully support the Nvidia CUDA Toolkit and will cause many problems.

Python Interpreter:Python 3.13.5, I installed a standalone Python interpreter specifically for compiling the OpenCV library files used in Python programming.

CMake flags:

1.Check "WITH_CUDA", "OPENCV_DNN_CUDA" , OPENCV_DNN_OPENVINO(or OPENCV_DNN_OPENCL/OPENCV_DNN_TFLITE), individually, and do not check "BUILD_opencv_world", Set the path of OPENCV_EXTRA_MODULES_PATH, for example: D:/SoftWare/OpenCV_Cuda/opencv_contrib-4.x/modules;

2. Set the values of CUDA_ARCH_BIN and NVIDIA PTX ARCHs to 12.0, and check WITH_CUDNN.

3. Check "OPENCV_ENABLE_NONFREE"; If you want to compile the OpenCV library file used for Python programming, the numpy library needs to be installed in the installation path of the Python interpreter. You also need to set the following several paths, for example:

PYTHON3_EXECUTABLE: D:/SoftWare/Python313/python.exe

PYTHON3_INCLUDE_DIR: D:/SoftWare/Python313/include

PYTHON3_LIBRARY: D:/SoftWare/Python313/libs/python310.lib

PYTHON3_NUMPY_INCLUDE_DIRS: D:/SoftWare/Python313/Lib/site-packages/numpy/_core/include

PYTHON3 PACKAGES PATH: D:/SoftWare/Python313/Lib/site-packages

4. Then check BUILD_opencv_python3 and ENABLE_FAST_MATH.

After the configuration is completed, use CMake's "generate" function to create "OpenCV.sln". Open "OpenCV.sln" with Visual Studio, and complete the final compilation process by using "ALL BUILD" and "INSTALL". As long as there are no errors reported by Visual Studio, the OpenCV library files have been compiled successfully.

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Leon Brant

79775423

Date: 2025-09-26 02:07:50
Score: 8.5 🚩
Natty: 5
Report link

has this been fixed? I am facing the same issue and not sure what is wrong.

Reasons:
  • RegEx Blacklisted phrase (1.5): fixed?
  • Low length (1.5):
  • No code block (0.5):
  • Me too answer (2.5): I am facing the same issue
  • Contains question mark (0.5):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Karthik

79775415

Date: 2025-09-26 01:45:45
Score: 3
Natty:
Report link

You should use "Union All" when you are joining two sources with the same number of columns and columns of similar nature i.e You want all the records from both the sources.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Aurangzeb Khan

79775414

Date: 2025-09-26 01:38:43
Score: 2.5
Natty:
Report link

I couldn't find a way to input an empty string through the Airflow UI either. My workaround is to input a space and strip the param in code.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: iucario

79775398

Date: 2025-09-26 00:58:35
Score: 3.5
Natty:
Report link

Native Image has the restriction that "Properties that change if a bean is created are not supported (for example, @ConditionalOnProperty and .enabled properties)" due to the closed-world assumption.

https://docs.spring.io/spring-boot/reference/packaging/native-image/introducing-graalvm-native-images.html#packaging.native-image.introducing-graalvm-native-images.understanding-aot-processing

Reasons:
  • Probably link only (1):
  • Low length (0.5):
  • No code block (0.5):
  • User mentioned (1): @ConditionalOnProperty
  • Low reputation (0.5):
Posted by: Toshiaki Maki

79775394

Date: 2025-09-26 00:46:32
Score: 0.5
Natty:
Report link

You forgot to double quote the string:

set(compileOptions "-Wall -Wextra -O3 -Wno-narrowing -ffast-math -march=native")

So what ends up happening is that compileOptions becomes a semicolon-separated list ("-Wall;-Wextra;-O3;…") rather than a single string, because each unquoted token is passed to the set command as a separate value.

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Jack D Menendez

79775386

Date: 2025-09-26 00:09:24
Score: 3
Natty:
Report link

You need to reference $PARAMETER1 instead of $@ in the inline script command. These parameters exist at the ARM level; they will not be passed as arguments to the script.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: World9

79775383

Date: 2025-09-25 23:52:20
Score: 1.5
Natty:
Report link

Use the distinct function, like this:

distinct (column_name)
Reasons:
  • Low length (1.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Dipankar Biswas

79775376

Date: 2025-09-25 23:30:16
Score: 2.5
Natty:
Report link

Your code is correct. You are getting an error because of a known bug in Playground.

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
Posted by: Ben A.

79775361

Date: 2025-09-25 22:51:08
Score: 1.5
Natty:
Report link

Please consider using a finally block to reset your implicit wait to its default value; it's less error-prone and avoids code duplication.
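A minimal sketch in Java with Selenium 4 (the timeout values and helper name are illustrative):

import java.time.Duration;
import org.openqa.selenium.WebDriver;

class WaitUtils {
    static void withShortImplicitWait(WebDriver driver, Runnable action) {
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(1));        // temporary value
        try {
            action.run();
        } finally {
            driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));   // always restored, even if the action throws
        }
    }
}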

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Davide Sorcelli

79775360

Date: 2025-09-25 22:48:08
Score: 0.5
Natty:
Report link

I fixed this by not using sudo for my command.

Reasons:
  • Whitelisted phrase (-2): I fixed
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
Posted by: Spencer Goff

79775359

Date: 2025-09-25 22:46:07
Score: 1.5
Natty:
Report link

In order for LIME to work correctly and effectively, it requires probability scores rather than hard predictions.

The current setup uses rf.predict, which produces 0/1 labels. For LIME to receive a detailed probability distribution, use rf.predict_proba; this will let it properly explain the predictions.

To solve this, switch to rf.predict_proba when calling explainer.explain_instance. This adjustment gives LIME access to the probability scores it needs for its analysis.
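A minimal sketch of the change (the explainer setup and variable names are assumptions based on a typical LimeTabularExplainer workflow):

from lime.lime_tabular import LimeTabularExplainer

explainer = LimeTabularExplainer(
    X_train.values,
    feature_names=list(X_train.columns),
    class_names=["0", "1"],
    mode="classification",
)

# Pass the probability function, not the hard 0/1 predictions:
exp = explainer.explain_instance(X_test.values[0], rf.predict_proba, num_features=5)
print(exp.as_list())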

Reasons:
  • No code block (0.5):
  • Low reputation (1):
Posted by: user30818063

79775349

Date: 2025-09-25 22:35:04
Score: 2.5
Natty:
Report link

After upgrading from "expo": "54.0.7", to "expo": "54.0.8", I was finally able to run eas build -p ios successfully today.

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Kelton Cabral

79775343

Date: 2025-09-25 22:20:00
Score: 4.5
Natty: 5.5
Report link

Sir help me code: Sarangheo Autotype javascript..

Reasons:
  • Blacklisted phrase (1): help me
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Glenn Francis Barrio

79775337

Date: 2025-09-25 22:09:57
Score: 2.5
Natty:
Report link

The solution for me was to switch from path.toShapes(true) to SVGLoader.createShapes(path) when using ExtrudeGeometry for the shapes.

(Screenshots: incorrect rendering vs. correct rendering.)

Reasons:
  • Probably link only (1):
  • Low length (1):
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Alex Popov

79775332

Date: 2025-09-25 21:56:53
Score: 2
Natty:
Report link

The issue was ultimately the workflow steps and not getting all the session keys properly set. Clicking the Sign In button takes you to https://auth.pff.com, and I had tried going to https://auth.pff.com directly. However, when I adjusted this and went to https://premium.pff.com and clicked the "sign in" button there, everything populated correctly. Otherwise, for some reason, the session key for "loggedIn" was not getting set to True.

I did have to add a 1-2 second sleep as well to make sure the captcha loaded; no interaction with it, but you just had to let it load.

Reasons:
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Kevin3142

79775330

Date: 2025-09-25 21:50:52
Score: 1.5
Natty:
Report link

You can find a step-by-step explanation, and can use custom input for the Aho-Corasick algorithm, here:

https://antinloop.com/algorithms/aho-corasick/demo

Reasons:
  • Whitelisted phrase (-1.5): you can use
  • Probably link only (1):
  • Low length (1):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: abdelkarim

79775328

Date: 2025-09-25 21:47:51
Score: 1
Natty:
Report link

You could do this with randcraft

from randcraft.constructors import make_discrete

bernoulli = make_discrete(values=[0, 1], probabilities=[0.8, 0.2])
bernoulli_100 = bernoulli.multi_sample(100)
bernoulli_100.plot()

results = bernoulli_100.sample_numpy(5)
print(results)
# [10. 15. 20. 14. 24.]

(Image: resulting PDF plot.)

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: SusRel

79775323

Date: 2025-09-25 21:42:50
Score: 9
Natty: 7
Report link

where did u get the Bluetooth sdk for the ACR1255U-J1 from because mine came only with a java sdk which wont work for android?

Reasons:
  • RegEx Blacklisted phrase (3): did u get the
  • Low length (1):
  • No code block (0.5):
  • Ends in question mark (2):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Starts with a question (0.5): where did
  • Low reputation (1):
Posted by: NewbieHobbycoder

79775322

Date: 2025-09-25 21:40:49
Score: 3
Natty:
Report link

I found the answer to this.

Had to allow this permission for the EKS node IAM role

ecr:BatchImportUpstreamImage

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: Shiran Joseph

79775309

Date: 2025-09-25 21:22:44
Score: 0.5
Natty:
Report link

Try installing Rosetta via softwareupdate --install-rosetta. I had the same issue, and when running xcrun simctl list runtimes -v I saw it mentioned a lack of Rosetta.

Reasons:
  • Whitelisted phrase (-1): i had the same
  • Low length (1):
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Umar

79775304

Date: 2025-09-25 21:10:41
Score: 3.5
Natty:
Report link

I have been facing the same issue that you have described.

After updating the library "com.google.android.gms:play-services-ad" to version "24.6.0" it got solved.

This version was released on September 9th and it is the latest.

I hope it works for you too!

https://mvnrepository.com/artifact/com.google.android.gms/play-services-ads/24.6.0

Reasons:
  • Whitelisted phrase (-1): hope it works
  • Low length (0.5):
  • No code block (0.5):
  • Me too answer (2.5): facing the same issue
  • Low reputation (1):
Posted by: Victor Cioffi

79775295

Date: 2025-09-25 20:51:37
Score: 0.5
Natty:
Report link

If the problem is located in a third-party gem instead of your own code, then it might be easier to use Andi Idogawa's file_exists gem, at least temporarily (explanatory blog post).

bundle add file_exists

Then add to e.g. config/boot.rb:

require 'file_exists'
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: 3wordchant

79775290

Date: 2025-09-25 20:36:33
Score: 3
Natty:
Report link

Using an ontology to guide the tool sounds smart like checking everything carefully when you bonuskaart scannen to make sure it works as expected.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: johsephkarter7

79775289

Date: 2025-09-25 20:36:33
Score: 2.5
Natty:
Report link

In case you're struggling with Calendly and only need an API, check out Recal: https://github.com/recal-dev. We also open-sourced our scheduling SDK and are integrating a Calendly wrapper API right now. If you want early access, just shoot me a message: [email protected]

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Alex E

79775272

Date: 2025-09-25 20:16:25
Score: 10.5 🚩
Natty: 6
Report link

Did you manage to run it?

I have a similar problem with the H747.

Reasons:
  • Blacklisted phrase (1): i have similar
  • RegEx Blacklisted phrase (3): Did you manage to
  • Low length (1.5):
  • No code block (0.5):
  • Me too answer (2.5): i have similar problem
  • Contains question mark (0.5):
  • Starts with a question (0.5): Did you
  • Low reputation (1):
Posted by: Patryk Kucia

79775264

Date: 2025-09-25 20:06:23
Score: 1.5
Natty:
Report link
mkdir /tmp/podman-run_old
mv -v /tmp/podman-run-* /tmp/podman-run_old/
# start all dead containers
podman start $(podman ps -qa)
Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Łukasz Filipek

79775259

Date: 2025-09-25 20:00:21
Score: 1
Natty:
Report link

I would turn to window functions and perhaps a common table expression, such as:

with cte as (
    select *,
           row_number() over (partition by multiplier, id) as lag_multiplier
    from table
)
update table
set id = concat(cast(cte.id as varchar), cast(cte.lag_multiplier as varchar))
from cte
where table.id = cte.id
  and table.id in (select id from cte where multiplier != 0);

/* Note that I don't work with UPDATE much, and haven't tested this query, so the syntax might be off. It's also a little expensive, and I'm not sure if that can be improved. Best of luck. */

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Lee

79775255

Date: 2025-09-25 19:56:20
Score: 5.5
Natty:
Report link

Have you solved this problem? I think I have a similar issue. BR, Joachim

Reasons:
  • RegEx Blacklisted phrase (1.5): Solved this Problem?
  • Low length (1.5):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Joachim Willner

79775244

Date: 2025-09-25 19:39:15
Score: 3.5
Natty:
Report link

This is definitely feasible, but we would need to look at your webhook listener code.

On the DocuSign side, please refer to this documentation on how to use and set up Connect notifications.

https://developers.docusign.com/platform/webhooks/

Thank you.

Reasons:
  • Blacklisted phrase (0.5): Thank you
  • Blacklisted phrase (1): this document
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Angel Garcia Reyes

79775234

Date: 2025-09-25 19:26:13
Score: 0.5
Natty:
Report link

I popped by here when researching the 255-character Transpose limit, as I expect others have and will. I got a bit thrown off course, but finally straightened it out in my head, and so thought I could make a worthwhile contribution for others passing by in the future.

There are two issues here, which may not be immediately obvious.

_1) The Transpose function does not like it if it is working on a Variant element type array, where one or more of the array elements are a string of more than 255 characters.

If we are dealing with 1-dimensional arrays, as in the original question, then there is a way to get over this without looping, while still using the Transpose function: use the Join function on the Variant array (with an arbitrary separator), then use the Split function on that. We then end up with a String array, and Transpose is happy with any elements of more than 255 characters.

This next demo coding almost gets what was wanted here, and variations of it may be sufficient for some people having an issue with the 255 Transpose Limit.

Sub RetVariantArrayToRange() '  
 Let ActiveSheet.Range("M2:M5") = TransposeStringsOver255()  
End Sub  
Function TransposeStringsOver255()  
Dim myArray(3) As Variant  'this the variant array I will attempt to write  
  
' Here I fill each element with more than 255 characters  
myArray(0) = String(300, "a")  
myArray(1) = String(300, "b")  
myArray(2) = String(300, "c")  
myArray(3) = String(300, "d")  '    
  
' Let TransposeStringsOver255 = Application.Transpose(myArray()) ' Errors because Transpose does not work on a   Variant  type array if any element is a string greater than 255 characters  
  
Dim strTemp As String, myArrayStr() As String  
 Let strTemp = Join(myArray(), "|")  
 Let myArrayStr() = Split(strTemp, "|")  
  
 Let TransposeStringsOver255 = Application.Transpose(myArrayStr())  
  
End Function  

_2) That last coding does not do exactly what was wanted. The specific requirement was along these lines, (if using the function above) :
…..select an area of 4 rows x 1 column and type "=TransposeStringsOver255()" into the formula bar (do not enter the quotes). and hit (control + shift + enter)…..
That last coding does not work to do exactly that.
As Tim Williams pointed out, the final array seems to need to be a String array (even if it is being held in a Variant variable). Why that should be is a mystery, since the demo coding above seems to work as a workaround to transpose strings over 255 characters in a Variant array to a range.

To get over the problem, we loop the array elements into a String array. Then the mysterious problem goes away.
This next coding would be the last coding with that additional bit

Function TransposeStringsOver255VariantArrayToSelectedRange()  
Dim myArray(3) As Variant  'this the variant array I will attempt to write  
  
' Here I fill each element with more than 255 characters  
myArray(0) = String(300, "a")  
myArray(1) = String(300, "b")  
myArray(2) = String(300, "c")  
myArray(3) = String(300, "d") ' -  
  
' Let TransposeStringsOver255VariantArrayToSelectedRange = Application.Transpose(myArray()) ' Errors because Transpose does not work on a   Variant  type array if any element is a string greater than 255 characters  
  
Dim strTemp As String, myArrayStr() As String  
 Let strTemp = Join(myArray(), "|")  
 Let myArrayStr() = Split(strTemp, "|")  
  
' Let TransposeStringsOver255VariantArrayToSelectedRange = Application.Transpose(myArrayStr()) ' Errors because  "Seems like you need to return a string array"   Tim Williams: https://stackoverflow.com/a/35399740/4031841  
Dim VarRet() As Variant  
 Let VarRet() = Application.Transpose(myArrayStr())  
Dim strRet() As String, Rw As Long  
 ReDim strRet(1 To UBound(VarRet(), 1), 1 To 1)  
    For Rw = 1 To UBound(VarRet(), 1)  
     Let strRet(Rw, 1) = VarRet(Rw, 1)  
    Next Rw  
 Let TransposeStringsOver255VariantArrayToSelectedRange = strRet()  
End Function  

To compare in the watch window:
The first coding ends up getting this array, which in many situations will get the job done for you
https://i.postimg.cc/fWYQvsTy/c-Transpose-Strings-Over255.jpg

But for the exact requirement of this Thread, we need what the second coding gives us, which is this:
https://i.postimg.cc/FRL585yP/f-Transpose-Strings-Over255-Variant-Array-To-Selected-Range.jpg

_.______________________________________-

Since we are now having to loop through each element, then we might just as well forget about the Transpose function , and change the loop slightly to do the transpose at the same time

Function TransposeStringsOver255VariantArrayToSelectedRange2()  
Dim myArray(3) As Variant  'this the variant array I will attempt to write  
  
' Here I fill each element with more than 255 characters  
myArray(0) = String(300, "a")  
myArray(1) = String(300, "b")  
myArray(2) = String(300, "c")  
myArray(3) = String(300, "d") ' -  
  
Dim strRet() As String, Rw As Long  
 ReDim strRet(1 To UBound(myArray()) + 1, 1 To 1)  
    For Rw = 1 To UBound(myArray()) + 1  
     Let strRet(Rw, 1) = myArray(Rw - 1)  
    Next Rw  
 Let TransposeStringsOver255VariantArrayToSelectedRange2 = strRet()  
End Function  

We have now arrived at a solution similar to that from Tim Williams.
(One thing that initially threw me off a bit was the second function from Tim Williams, as some smart people had told me that to get an array out of a function it must be declared as
Function MyFunc() As Variant
I had never seen a function like
Function MyFunc() As String()
before.)

Hoping this bit of clarification may help some people passing by as I did.

Alan

Reasons:
  • Blacklisted phrase (1): stackoverflow
  • Long answer (-1):
  • Has code block (-0.5):
  • Filler text (0.5): ______________________________________
  • Low reputation (0.5):
Posted by: Alan Elston

79775228

Date: 2025-09-25 19:16:09
Score: 5.5
Natty: 5
Report link

Not an answer but an extension of the question.

If I want to copy the contents of say File1 to a new File2 while only being able to have one file open at a time in SD.

It seems that I can open File1 and read to a buffer until say a line end, and then close File1, open File2 and write to File2. Close File2 and reopen File1.

Then I have a problem, having reopened File1 I need to read from where I had got to when I last closed it. Read the next until say line end, close File1, reopen File2 as append and write to File2.

The append means that File 2 gradually accumulates the information so no problem but I am unclear as to how in File1 I return to the last read location.

Do I need to loop through the file each time I open it for the number of, until line end, reads previously done?

Reasons:
  • Blacklisted phrase (0.5): I need
  • Blacklisted phrase (1): Not an answer
  • RegEx Blacklisted phrase (1): I want
  • Long answer (-0.5):
  • No code block (0.5):
  • Ends in question mark (2):
  • Low reputation (1):
Posted by: Harry J Crowley

79775220

Date: 2025-09-25 19:08:07
Score: 4.5
Natty: 6
Report link

This thread looks quite old, but I came across a similar issue.

I am trying to copy millions of files from one server to another over the network.

When I use the robocopy command without /mt, it seems to work fine. But when I add /mt, /mt:2, etc., it gets stuck on the same screen as above. RAM usage keeps increasing. I waited 20 minutes but nothing happened; it just copied the folders, not the files inside. This happens on Windows Server 2016.

Can anyone suggest something?

Reasons:
  • Blacklisted phrase (1): I am trying to
  • No code block (0.5):
  • Ends in question mark (2):
  • Low reputation (1):
Posted by: Emrah BAYSAN

79775216

Date: 2025-09-25 19:00:04
Score: 1.5
Natty:
Report link

To target a specific file size (worked for jpeg), say 300kb:

convert input.jpg -define jpeg:extent=300kb output.jpg

This forces the output file to be about 300 KB.

Reasons:
  • Low length (1):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: Daniel Aliu

79775204

Date: 2025-09-25 18:46:01
Score: 1
Natty:
Report link

It seems the issue was within Flutter's code and my IDE was trying to debug it.

My VS Code debugging configuration was set to "Debug my code + packages" so it was also trying to debug Flutter's code and that's why it would open up binding.dart because there was an error in that code.

Setting debugging config to just "Debug my code" should fix this problem!

You can do this from the bottom left in VS Code, just next to the error count and warning counts.

Edit: You can only change this when you're running a debug session. Launch a debug instance and the toggle to change this should appear in the bottom left corner.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: snowyvibes

79775203

Date: 2025-09-25 18:46:01
Score: 0.5
Natty:
Report link

Kafka is a stream, not a format.

df = spark \
    .readStream \
    .format("kafka") \
    .option("kafka.bootstrap.servers", "localhost:9092") \
    .option("subscribe", "sparktest") \
    .option("startingOffsets", "earliest") \
    .load()
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Frank

79775201

Date: 2025-09-25 18:42:00
Score: 3
Natty:
Report link

It's Python 3.12 issue, try downgrading to 3.11

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Frank

79775200

Date: 2025-09-25 18:42:00
Score: 0.5
Natty:
Report link

In your nuxt.config.ts do:

// https://nuxt.com/docs/api/configuration/nuxt-config

export default defineNuxtConfig({
  $production: {
   nitro: {
    preset: 'aws-amplify',
    awsAmplify: {
     runtime: "nodejs22.x"
    },
   },
  },
});
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Emiliano Díaz

79775196

Date: 2025-09-25 18:38:59
Score: 1
Natty:
Report link

I know it is an old thread, but I still faced this issue on Windows and finally got a working solution after multiple attempts:

$OutputEncoding = [System.Text.Encoding]::UTF8
[System.Console]::OutputEncoding = [System.Text.Encoding]::UTF8
python script.py > output.txt
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: broman

79775187

Date: 2025-09-25 18:24:55
Score: 0.5
Natty:
Report link

I once had to change my CER file from "UTF-16 LE BOM" to "UTF-8". I'm not sure how this applies to you directly, but that's basically the error I got from openssl when working with certificates with the wrong text encoding.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: rocketsarefast

79775183

Date: 2025-09-25 18:19:53
Score: 0.5
Natty:
Report link

I once had to change my CER file from "UTF-16 LE BOM" to "UTF-8". I'm not sure how this applies to you directly, but that's basically the error I got from openssl when working with certificates with the wrong text encoding.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • High reputation (-1):
Posted by: rocketsarefast

79775178

Date: 2025-09-25 18:13:52
Score: 1
Natty:
Report link

I also faced this issue for many years and found nothing on the internet. After a long time, I finally found a solution: a small but excellently working add-on, linked below.

It's very easy: just install the add-on, copy the data from Excel, go to the Thunderbird compose window, and press CTRL + Q, and you are done.

No need for MS Word or any other kind of word processor; your data will be pasted as-is, with rich text formatting and colors.

https://addons.thunderbird.net/en-US/thunderbird/addon/paste-excel-table-into-compose/

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Abhijit

79775168

Date: 2025-09-25 18:05:49
Score: 0.5
Natty:
Report link

In 2025 I just renamed C:\project\.git\hooks\pre-commit.sample to pre-commit

#!/bin/sh
echo "🚀 Run tests..."
php artisan test
if [ $? -ne 0 ]; then
  echo "❌ Test failed!"
  exit 1
fi
echo "✅ Passed, pushing..."
exit 0
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Vit

79775136

Date: 2025-09-25 17:32:36
Score: 2
Natty:
Report link

I believe this has something to do with virtualization but I don't fully understand what's going on, why is this and how do I fix it.

Virtualization is simple: If you have 10000 strings, the UI will only create however many ListViewItem controls are needed to fit the viewport.

When you set CanContentScroll to false, the ScrollViewer will "scroll in terms of physical units", according to the documentation. That means that all 10000 ListViewItems will be created, lagging the UI.

Is there a way to keep it False so it won't show an "empty line" at the end?

By keeping it False, you kill performance. If you want to get rid of the empty line at the bottom and eliminate the lag, you should override the ListView's items panel in order to change the VirtualizingStackPanel's behavior.

<ListView ScrollViewer.CanContentScroll="True">
    <ListView.ItemsPanel>
        <ItemsPanelTemplate>
            <VirtualizingStackPanel ScrollUnit="Pixel"
                                    IsVirtualizing="True"/>
        </ItemsPanelTemplate>
    </ListView.ItemsPanel>
</ListView> 

ScrollUnit="Pixel" makes the ScrollUnit be measured in terms of pixels, which should eliminate the empty line at the bottom.

Reasons:
  • Blacklisted phrase (1): how do I
  • Blacklisted phrase (1): Is there a way
  • RegEx Blacklisted phrase (0.5): why is this
  • Long answer (-1):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (0.5):
Posted by: vmHernandes

79775133

Date: 2025-09-25 17:30:36
Score: 3
Natty:
Report link

Same problem with Blazor Server. The NuGet package BootstrapBlazor bundles the necessary Bootstrap files in the staticwebassets folder, so it is properly deployed for Blazor, and you can reference it as such:

<link href="_content/BootstrapBlazor/css/bootstrap.min.css" rel="stylesheet" />

Reasons:
  • RegEx Blacklisted phrase (1): Same problem
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Peter Murdoch

79775110

Date: 2025-09-25 16:54:26
Score: 5.5
Natty:
Report link

I'm facing the same issue while upgrading my Node app to Node 18, using Serverless Components 3.6 and Next.js 14. I tried many approaches but didn't find a fix.

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Me too answer (2.5): facing the same issue
  • Single line (0.5):
  • Low reputation (1):
Posted by: Rudra das

79775105

Date: 2025-09-25 16:43:23
Score: 4
Natty:
Report link

This is such a non-issue; just get better.

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Unregistered user (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Jones

79775100

Date: 2025-09-25 16:40:21
Score: 3
Natty:
Report link

public static bool IsNegative(this TimeSpan value) => value.Ticks < 0;

Reasons:
  • Low length (1.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Solarev Sergey

79775095

Date: 2025-09-25 16:38:21
Score: 1
Natty:
Report link

The yml:

    - name:  RUN PYTHON ON TARGET
      changed_when: false
      shell: python3 /.../try_python.py {{side_a}}
      become: true
      become_user: xxxx
      register: py_output

The script (adapted to AAP and tested locally):

# name = input()
with open("/.../try_txt.txt", "w") as file:
    file.write(f"{{$1}}")

The survey contains only the "side_a" variable, and it is working already for bash cases.

Reasons:
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: picgon

79775093

Date: 2025-09-25 16:37:20
Score: 0.5
Natty:
Report link

Since this question is a bit old and doesn't seem to have a clear answer, here is my proposed approach.

First, I would segment the large dataset into smaller, more manageable chunks based on a time window (for example, creating a separate DataFrame for each month). For each chunk, I would perform exploratory data analysis (EDA) to understand its distribution, using tools like histograms, Shapiro-Wilk/Kolmogorov-Smirnov tests for normality, and QQ-Plots.

In a real-world scenario with high-frequency data, such as a sensor recording at 100 Hz (i.e., one reading every 0.01 seconds), processing the entire dataset at once is impossible if you're working on a local machine. Therefore, I would take a representative sample of the data. I would conduct the EDA on this sample, then calculate the normalization parameters from it. These parameters would then be used as the basis to normalize the rest of the data for that period (e.g., the entire month).

By normalizing the data to a consistent range, such as [0,1], the different segments of data should become directly comparable.

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Roberto Priego

79775077

Date: 2025-09-25 16:14:14
Score: 0.5
Natty:
Report link

The documentation is contradictory about the difference between the volatile keyword and VH.setVolatile.
I don't remember the chapter... but the one for VarHandle explicitly states that it resembles a fullFence... which means that at least setVolatile and getVolatile are both seq_cst barriers.

Now, I have my doubts that the keyword version is as strong.

The reason they are so obtuse about it is that within chapter 17 they attempt to try to explain both... the lock monitor and the volatile read/writes as if they were similar.

Chapter 17 treats the concept of "Synchronization order" out of nowhere.
It doesn't explain WHAT enforces it or how it even works under the hood.

I know by experience that the keyword is a lock-queue... so it being "totally ordered" is not true for MCS/CLH lock-queues which could very well work perfectly fine with both acquire and release semantics.

But anyways...
Chapter 17.4.3 makes a subtle distinction in my mind...
It states:

"A write to a volatile variable v (§8.3.1.4) synchronizes-with all subsequent reads of v by any thread (where "subsequent" is defined according to the synchronization order)"

Notice the property "synchronization order" is not explicitly granted to the "write to a volatile variable v" action/subject.

This means that the "total order" property previously granted to the "synchronization order" concept is not the same thing as a volatile read/write itself; in the prior section, Chapter 17.4.2, both were merely classified as "synchronization actions", not as an order.

17.4.2. Actions

An inter-thread action is an action performed by one thread that can be detected or directly influenced by another thread. There are several kinds of inter-thread action that a program may perform:

  • Read (normal, or non-volatile). Reading a variable.

  • Write (normal, or non-volatile). Writing a variable.

  • Synchronization actions, which are:

    • Volatile read. A volatile read of a variable.

    • Volatile write. A volatile write of a variable.

Then, in the next chapter, the "total order" property is given to the concept of "synchronization order"... but not actions.

17.4.3. Programs and Program Order

Among all the inter-thread actions performed by each thread t, the program order of t is a total order that reflects the order in which these actions would be performed according to the intra-thread semantics of t.

Which makes me guess that what they are really talking about in this paragraph is the synchronized keyword, a.k.a. the monitor/CLH queue.

In which case, yes, it behaves as a seq_cst barrier, no doubt about that.

Now... going back to the first quote:

"A write to a volatile variable v (§8.3.1.4) synchronizes-with all subsequent reads of v by any thread (where "subsequent" is defined according to the synchronization order)"

The fact that the documentation uses the wording "variable v" implies a monotonic sequencing defined by "per-address sequential consistency", which, as far as I understand, is the BASE program-order sequencing respected by ALL memory models/processors (bare metal), no matter how weak or strong they are.

And if any JIT or compiler disobeys this principle, then I would recommend that no one use that implementation anyway.

The phrase "all subsequent reads of v" strongly implies that the barrier is anchored to the dependency chain of the address v (a monotonic dependency chain).

Hence this is effectively a release, since unrelated operations on addresses other than v are still allowed to be reordered before it.

(To me) the usage of the word "v" is the hint that the volatile keyword is an acquire/release barrier.

If not, then the documentation needs to provide more explicit wording.
But this is not just a Java issue; even within the Linux kernel, the concepts of barriers/fences and synchronization get mixed up, so I don't blame them.

Reasons:
  • RegEx Blacklisted phrase (2): I have my doubt
  • Long answer (-1):
  • Has code block (-0.5):
Posted by: Delark

79775075

Date: 2025-09-25 16:10:13
Score: 3.5
Natty:
Report link

Dude, more than 5 years later and you've helped me solve my problem. Thank you very much!!! Be blessed!

Reasons:
  • Blacklisted phrase (0.5): Thank you
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: Pedro H

79775069

Date: 2025-09-25 16:06:12
Score: 2
Natty:
Report link

The command used for broadcasting was wrong.

The correct command is:

am broadcast -n com.ishacker.android.cmdreceiver/.CmdReceiver --es Cmd "whoami"

The -n flag specifies the component name explicitly. Without it, the broadcast may not be delivered correctly to the receiver, and trying to get extras with intent.getStringExtra() will result in null being returned.

Thanks @Maveňツ for posting the suggestion in the comments.

Reasons:
  • Blacklisted phrase (0.5): Thanks
  • Has code block (-0.5):
  • User mentioned (1): @Maveňツ
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: IsHacker

79775066

Date: 2025-09-25 16:04:12
Score: 0.5
Natty:
Report link

It's been a few years since the question was asked, but since no good answer emerged, here's how I do it:

Overview - use git's global config

I use git's global config to store remote config blocks with fetch and push URLs, fetch and push refspecs, custom branch.<name>.remote routes, merge settings, etc.

The global config is split into one config file per project, and each project file gets included into $HOME/.gitconfig conditionally using [include] and [includeIf] blocks.

Example:

[includeIf "gitdir:ia2/website/.git"]
    path=ia2/website.config

[includeIf "onbranch:cf/"]
    path=cloudflare-tests.config

In this example, the file $HOME/.gitconfigs/ia2/website.config is automatically included when I work on files in the $HOME/proj/ia2/website directory, which is the website for the ia2 project.

Also, in any project, I can create a branch named "cf/..." which causes the cloudflare-tests.config file to be included in git's configuration, which routes that branch to a repo I have connected to Cloudflare Pages. This allows any of my projects to be pushed to a Cloudflare Pages site by simply creating an appropriate "cf/" branch in that project.
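For a concrete picture, a project config file such as ia2/website.config might contain something like this (the remote URL and branch name are placeholders I made up, not from my actual setup):

[remote "origin"]
    url = git@github.com:example/website.git
    fetch = +refs/heads/*:refs/remotes/origin/*
    pushurl = git@github.com:example/website.git

[branch "main"]
    remote = origin
    merge = refs/heads/main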

How the .gitconfigs scheme works

The local config (i.e., the .git/config file present in each clone) doesn't contain any repo configuration, other than things that accidentally end up there. Any settings I want to keep and duplicate on other machines are moved from the local .git/config to the global $HOME/.gitconfigs/$PROJECT.config file.

Since all configs for all my projects live under the same $HOME/.gitconfigs directory, this directory is itself a git repository, which I push to github, and fetch on all machines where I need it.

Keeping all cloned repos on synchronized git config

I have a repository named .gitconfigs at github, and I clone this in the $HOME directory of every machine I develop on.

Each one of the projects I'm working on has its corresponding $project.config file maintained in a branch with the same name as the project, and there are some config files that are included in all projects, like the cloudflare example I gave above.

Configs for private projects

The scheme is capable of maintaining a mix of private and public projects. Configs for public projects are pushed to my public .gitconfigs repo, and the private projects' configs get pushed elsewhere. In a company setting, your dev team might maintain a private .gitconfigs repo for its shared project configs.

My implementation of the solution

You're welcome to inspect or fork my .gitconfigs repo at https://github.com/drok/.gitconfigs - give me a click-up if this helps you, and I welcome pull requests. I currently have public configs for curl, git, transmission, gdb and internet archive. One benefit of sending a PR is that I can give you feedback on whatever project you're adding. I've been using this technique for a year with huge time savings. No more losing project-specific repo settings for me.

Reasons:
  • Blacklisted phrase (0.5): I need
  • RegEx Blacklisted phrase (1): I want
  • Long answer (-1):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Jaredo Mills

79775049

Date: 2025-09-25 15:44:05
Score: 4
Natty: 5.5
Report link

Why are you using Breeze with Backpack?! Backpack has authorization out of the box. You should remove Breeze - it's not needed!

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Single line (0.5):
  • Starts with a question (0.5): Why you use
  • Low reputation (1):
Posted by: Alex Cool

79775047

Date: 2025-09-25 15:42:04
Score: 0.5
Natty:
Report link

I faced this problem in WSL2.

Check the permission:

ls -l /var/run/docker.sock

Correct the permission:

sudo chgrp docker /var/run/docker.sock;
sudo chmod 660 /var/run/docker.sock;

Then reset Docker to its factory defaults.

Then, in PowerShell:

wsl --shutdown

After doing this, you should be able to run:

docker ps
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: tetthys

79775028

Date: 2025-09-25 15:22:58
Score: 1
Natty:
Report link

I just finally got this to work. I had tried all the documentation that you reference without success. This time around I used the PowerShell script included in this Snowflake quick start to set up the OAuth resource and client app.

https://quickstarts.snowflake.com/guide/power_apps_snowflake/index.html?index=..%2F..index#2

After using the PowerShell script to set up the enterprise apps, I was still getting the bad gateway error. In my case, it turned out that Power Automate was successfully connecting to Snowflake but was failing to run this connection test:

USE ROLE "MYROLE";

USE WAREHOUSE "COMPUTE_WH";

USE DATABASE "SNOWFLAKE_LEARNING_DB";

USE SCHEMA "PUBLIC";SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'PUBLIC'

-- PowerPlatform-Snowflake-Connector v2.2.0 - GET testconnection - GetInformationSchemaValidation

;

I had created a Snowflake trial account to test the OAuth connection, and in that account the COMPUTE_WH warehouse was suspended. As a result, the test connection query was failing. After discovering that Power Automate was successfully connecting to Snowflake, I just did the proper setup on the Snowflake side to get the query to run (create a running warehouse, database, schema, and table, all usable by the specified user and role).

Here are some things to check:

  1. If you have access to Entra ID, check the sign-in logs under the service principal sign-ins tab. Verify your sign-in shows success.

  2. In Snowflake, check the sign-in logs for the user you created:
    SELECT * FROM TABLE(information_schema.login_history()) WHERE user_name = '<Your User>' ORDER BY event_timestamp DESC;

  3. Verify that the user you created has a default role, warehouse, and namespace specified.

  4. If Power Automate was able to log in, check the query history for your user and see if/why the connection test query failed.

  5. If Power Automate is successful in connecting to Snowflake but failing to run the connection test query, you could try the Preview version of the Power Automate Add Connection window. I see it has a check box to skip the connection test.

Reasons:
  • Long answer (-1):
  • No code block (0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: Snowflake Tony

79775026

Date: 2025-09-25 15:20:57
Score: 2
Natty:
Report link

As of 2012, WS-SOAPAssertions is a W3C Recommendation. It provides a standardized WS-Policy assertion to indicate which version(s) of SOAP are supported.

For details on how to embed and reference a policy inside a WSDL document, refer to WS-PolicyAttachment.

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: user31571297

79775025

Date: 2025-09-25 15:19:57
Score: 4
Natty:
Report link

Images and Icons for Visual Studio

https://learn.microsoft.com/en-us/visualstudio/extensibility/ux-guidelines/images-and-icons-for-visual-studio?view=vs-2022

Reasons:
  • Probably link only (1):
  • Low length (2):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Alparslan ŞEN

79775017

Date: 2025-09-25 15:14:55
Score: 2.5
Natty:
Report link

Nuxt does not have a memory leak but Vue 3.5 is known to have one. It should be resolved when Vue 3.6 is released, or possibly you can pin to Vue 3.5.13 (see https://github.com/nuxt/nuxt/issues/32240).

Reasons:
  • Low length (1):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Daniel Roe

79775014

Date: 2025-09-25 15:13:54
Score: 0.5
Natty:
Report link

Hi! Why dot_product over cosine?

Dot product is computationally faster for unit vectors: since the cosine similarity of two unit vectors equals their dot product, Elasticsearch can skip the normalization step when computing the score. For unit vectors, cosine(A,B) = dot(A,B) because ||A|| = ||B|| = 1.

Updated Elasticsearch mapping (v8+); set dims to your vector dimensions:

{
  "mappings": {
    "properties": {
      "vector_field": {
        "type": "dense_vector",
        "dims": 384,  // your vector dimensions
        "similarity": "dot_product"
      }
    }
  }
}
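As a quick sanity check of the unit-vector claim, here is a small standalone NumPy sketch (independent of Elasticsearch, using an arbitrary 384-dimensional example):

import numpy as np

rng = np.random.default_rng(42)

# Two random vectors, normalized to unit length (dot_product in
# Elasticsearch likewise expects unit-length vectors at index time).
a = rng.normal(size=384)
b = rng.normal(size=384)
a /= np.linalg.norm(a)
b /= np.linalg.norm(b)

cosine = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
dot = np.dot(a, b)

# For unit vectors the two scores agree up to floating-point error.
print(cosine, dot, np.isclose(cosine, dot))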
Reasons:
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: Alex Salgado

79775003

Date: 2025-09-25 14:58:51
Score: 1
Natty:
Report link

Your approach can cause high memory usage with large integers, as it creates a sparse array filled with undefined values. The filter step also adds unnecessary overhead. For large datasets, it's inefficient compared to JavaScript's built-in .sort() or algorithms like Counting Sort or Radix Sort for specialized cases. Stick with .sort() for practicality and performance.

Reasons:
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: X Coder

79774997

Date: 2025-09-25 14:55:50
Score: 0.5
Natty:
Report link

Based on your setup, the latency inconsistency you're experiencing likely points toward a routing or proxy behavior difference between the external Application Load Balancer and the Classic version, rather than just a misconfiguration on your end. Though both load balancers operate in the Premium Tier and use Google's global backbone for low-latency anycast routing through GFEs, their internal architectures are not exactly the same. For instance, your external Application Load Balancer's Envoy layer, with its dynamic default load-balancing algorithm, may re-route through alternative GFEs during intercontinental hops (for example, your Asia-to-Europe test) when minor congestion occurs, which would explain the 260 ms-1000 ms fluctuations. Meanwhile, the Classic Load Balancer sticks to a simpler, single optimized path, minimizing fluctuations, hence the consistent RTT from Seoul to europe-west2.

It might also be worth contacting Google Cloud Support with all your findings to identify whether this is related to a larger network problem or an internal routing issue.

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: トトロ

79774996

Date: 2025-09-25 14:55:50
Score: 0.5
Natty:
Report link

Your POST became a GET because of an unhandled HTTP redirect.

Your GKE ingress redirected your insecure http:// request to the secure https:// URL. Following this redirect, your requests client automatically changed the method from POST to GET, which is standard, expected web behavior.

You may want to fix the API_URL environment variable on your Cloud Run service so it uses https:// from the start. This prevents the redirect and ensures your POST arrives as intended.

To reliably trace this, inspect the response.history attribute in your Cloud Run client code. This will show the exact redirect that occurred.
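For example, a minimal requests sketch (with a placeholder URL, not your actual endpoint) shows the method change after the redirect:

import requests

# Placeholder URL for illustration: an http:// endpoint whose ingress
# answers with a 301/302 redirect to https://.
resp = requests.post("http://api.example.com/v1/items", json={"name": "demo"})

# Each redirect that was followed shows up in resp.history.
for hop in resp.history:
    print(hop.status_code, hop.request.method, hop.headers.get("Location"))

# After a 301/302 redirect, requests re-issues the call as a GET.
print(resp.status_code, resp.request.method)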

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: shiro

79774989

Date: 2025-09-25 14:52:49
Score: 1.5
Natty:
Report link

My polyfills got dropped when I upgraded Angular, and they needed to be re-added to angular.json (specifically, it was the @angular/localize line):

"polyfills": [
              "zone.js",
              "@angular/localize/init"
            ],

Source:
https://stackoverflow.com/a/76100353/4172413

Reasons:
  • Blacklisted phrase (1): stackoverflow
  • Low length (0.5):
  • Has code block (-0.5):
  • Low reputation (0.5):
Posted by: Brian Davis