Each idea is appended as a tidy block with timestamp, title, DoD, To Do, Done, and a separator. Commits automatically (optional push).
"$( (hledger accounts --directive --types) -join "`n" )"
because it does not depend on $OFS and makes your intent clear.
.zoom {
transition: 1s ease-in-out;
}
.zoom:hover {
transform: scale(2);
}
<div class="content">
<div class="zoom">
<img alt="Paperplane" src="https://via.placeholder.com/200" style="width: 200px; height: 62px">
<br> <span class="caption">A paper plane</span>
</div>
</div>
The following code works for Spring Boot 3.3.4 and Spring Security 6.3.3.
@Bean
SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
return http.csrf(csrf -> csrf.ignoringRequestMatchers(new AntPathRequestMatcher("/api/**"))).build();
}
As @Dogbert pointed out, use r a in magit status buffer.
And if you use other keybindings, such as evil, you can run M-x magit-rebase-abort, which is the command that r a executes.
As mentioned by @yshavit and Benjamin W., there is a deprecation-message feature for inputs to actions, but not for workflow inputs.
I use Imagick; you can see the working flow at https://aitextured.com/image_converter/ :
// Helper assumed by the answer (not shown in the original): parse "#rrggbb" into [r, g, b].
function parseHexColor(string $hex): array {
    return array_map('hexdec', str_split(ltrim($hex, '#'), 2));
}

function convertRasterWithImagick(string $src, string $to, array $opts): string {
    $im = new Imagick();
    $im->readImage($src);
    $to = strtolower($to);
    if ($to === 'jpg' || $to === 'jpeg') {
        $quality = (int)($opts['quality'] ?? 90);
        $bgHex = (string)($opts['bg'] ?? '#ffffff');
        if ($im->getImageAlphaChannel()) {
            // Flatten transparency onto a solid background before JPEG export.
            [$r, $g, $b] = parseHexColor($bgHex);
            $canvas = new Imagick();
            $canvas->newImage($im->getImageWidth(), $im->getImageHeight(), new ImagickPixel("rgb($r,$g,$b)"));
            $canvas->compositeImage($im, Imagick::COMPOSITE_OVER, 0, 0);
            $im->destroy();
            $im = $canvas;
        }
        $im->setImageFormat('jpeg');
        $im->setImageCompression(Imagick::COMPRESSION_JPEG);
        $im->setImageCompressionQuality($quality);
        $tmp = tempnam(sys_get_temp_dir(), 'conv_').'.jpg';
        $im->writeImage($tmp);
        $im->destroy();
        return $tmp;
    }
    if ($to === 'png') {
        $im->setImageFormat('png');
        $im->setImageCompression(Imagick::COMPRESSION_ZIP);
        $im->setImageCompressionQuality(0);
        $tmp = tempnam(sys_get_temp_dir(), 'conv_').'.png';
        $im->writeImage($tmp);
        $im->destroy();
        return $tmp;
    }
    $im->destroy();
    throw new RuntimeException('Unsupported target format for raster: '.$to);
}
In addition to Niresh's answer:
<style>
ins.adsbygoogle[data-ad-status="unfilled"] {
display: none !important;
}
</style>
Since the <ins> in the snippet doesn't have data-ad-status="unfilled", you'll get "Unused CSS selector" warning and Svelte will purge the CSS automatically. We can prevent it by having a conditional element that uses the selector in question that will never render:
{#if false} <!-- prevent svelte from purging "Unused CSS selector" -->
<ins class="adsbygoogle" data-ad-status="unfilled"></ins>
{/if}
fpdf2 2.8.4 supports tables; fpdf 1.7.2 doesn't.
fpdf2 is a fork of fpdf, and both share the same namespace. Your code must be using fpdf instead of fpdf2; that's why you are getting this error.
The easiest way to fix your issue is uninstalling fpdf 1.7.2 to make sure fpdf2 is used. If you need the legacy version for a specific use case, you might need to set up separate virtual environments to manage your code dependencies.
Have a conditional element that uses the selector in question but will never render:
{#if false}
<div class="red" ></div>
{/if}
I have an Angular project and installed Tailwind 4 as described in the framework guide on the Tailwind website.
But IntelliSense did not work.
Then I re-read the instructions and saw that the stylesheet file where you add @import "tailwindcss" must be a .css file.
I had a .scss file. I changed it to .css and suddenly IntelliSense worked.
So it seems that Tailwind 4 doesn't work with stylesheet files other than .css.
That's really bad, because, for example, styling Angular Material components the way it is recommended on the Angular Material website works only with .scss.
If anyone has a solution for that problem, please write.
const datePicker = document.getElementById("date-picker");
datePicker.min = getDate();
datePicker.max = getDate(14);
// Borrowed from https://stackoverflow.com/a/29774197/7290573
function getDate(days) {
let date;
if (days !== undefined) {
date = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
} else {
date = new Date();
}
const offset = date.getTimezoneOffset();
date = new Date(date.getTime() - (offset*60*1000));
return date.toISOString().split("T")[0];
}
<input id="date-picker" type="date" autocomplete="off" />
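The JavaScript above just computes "today" and "today + N days" and formats them as YYYY-MM-DD (the offset correction is needed because toISOString works in UTC). For comparison, here is the same idea sketched in Python, where date.today() is already local so no offset dance is needed; this is my addition, not part of the original answer:

```python
from datetime import date, timedelta

def get_date(days=0):
    """Today's local date shifted by `days`, formatted as YYYY-MM-DD."""
    return (date.today() + timedelta(days=days)).isoformat()

print(get_date())    # today, suitable for the input's min attribute
print(get_date(14))  # two weeks out, suitable for the max attribute
```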
Hmmm... it looks like you can just combine several JFRs with a simple
cat ./profile-* > /tmp/profile.jfr
and then generate a heatmap for all of the profile files:
jfrconv -o heatmap ./profile.jfr /tmp/heatmap.html
At least it works and the resulting heatmap looks okay (it shows a graph spanning several hours), but I'm not sure whether I lost any data with this approach.
I have been trying to solve this issue for 2 hours now. The only change I made was
array:string:[]
Thank you. Your solution worked.
Databricks has advanced its built-in support for working with spatial data.
Instead of storing and processing WKT (well-known text) formatted vector data, you should use the built-in functions ST_GeomFromWKT / ST_GeomFromText to convert the data to GEOMETRY columns. All spatial data processing should be done with these data types. The geo data types store spatial statistics (bounding boxes) that allow for efficient query execution by leveraging those stats to perform file skipping.
Also, Databricks' support for ST_INTERSECTS and ST_CONTAINS is now accelerated by an in-memory spatial index. You can experiment with the BROADCAST hint when working at larger scales, and know that the Databricks team is working on further improvements that will remove the need for any hinting.
This is the simple answer to your question. No need for PowerShell or complex for-commands:
dir /O:-D
/O specifies the sort order; D sorts by date/time, and the - prefix reverses it (newest first).
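For comparison, here is the same newest-first sort sketched in Python (my addition, not part of the original answer):

```python
from pathlib import Path

def newest_first(directory):
    """Files in `directory` sorted by modification time, newest first (like `dir /O:-D`)."""
    files = [p for p in Path(directory).iterdir() if p.is_file()]
    return sorted(files, key=lambda p: p.stat().st_mtime, reverse=True)
```

Usage: `newest_first(".")` returns Path objects; print `p.name` for a plain listing.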
Yes, it is possible in Amazon QuickSight.
You'll need to create a calculated field that:
- Filters records within your date range
- Finds the maximum date within that filtered set
- Returns the corresponding string value for that maximum date
- Groups by pID and cID
Example with your data. Given your table with:
- pID: 5055, cID: 41
- mindate: 9/27/2025
- maxdate: 9/30/2025
The calculated field will:
- Filter records where Date is between 9/27/2025 and 9/30/2025
- Find the maximum date in that range (9/30/2025)
- Return "StringB" for the row with date 9/30/2025
- Return null for all other rows
External links are used when you interconnect brokers/clusters between different regions.
When you want to achieve horizontal scaling, you can create a cluster of brokers via internal links.
Thank you, this is working for me.
A single machine instruction can, in principle, be represented by two assembly mnemonics. This is not a problem in itself, but it does imply that the relation between assembly and machine code is not necessarily 1:1; in other words, it is not necessarily a bijection. Such a relation must still be a function, though: there cannot be a single assembly mnemonic that represents two or more machine instructions.
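A toy model of this point: the assembler's mapping can be represented as a dictionary from mnemonic to encoding. Two mnemonics may share one encoding (on x86, `nop` is an alias for `xchg eax, eax`, both encoded as 0x90), but one mnemonic never maps to two encodings. That is a function, not necessarily a bijection. (Python sketch; encodings simplified to single bytes.)

```python
# Assembly -> machine code: a function (each mnemonic has exactly one encoding)...
encode = {
    "nop":           0x90,  # alias pair: both assemble to the same machine instruction
    "xchg eax, eax": 0x90,
    "ret":           0xC3,
}

# ...but not injective: two mnemonics can share one encoding.
print(len(encode), len(set(encode.values())))  # -> 3 2
```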
I got the same error after a storage update in my VS Code side panel. In the terminal, I executed the "claude" command, and it prompted me with this configuration error. I selected "reset with default configuration", which fixed my issue for the side-panel extension.
vs code version: 1.104.3 claude cli version: 1.0.128 I'm on Ubuntu 24.04.3 LTS (noble)

Allow me to add a GLORIOUS (but not perfect) solution to your list.
If you put an imported attribute without the full prefix within .. autosummary:: or after .. autodata::, Sphinx will insert the docstring of the attribute's type.
For example, if you write something like:
.. automodule:: rogue_scroll

.. autosummary::

   SYLLABLES

.. autodata:: SYLLABLES
You would get something like:
It is not easy to automatically (not hardcode) get the full path of an imported attribute. Even inside a template, you do not have easy access to that information. But I am here to the rescue!
It is a three-step solution:
sphinx-autogen and template magic. sphinx-autogen is a command-line tool shipped with Sphinx. It is also used automatically under the hood by the directive .. autosummary:: with the option :toctree:.
I am using Python 3.13 and Sphinx 8.2.3.
├─ src/
│ └─ rogue_scroll/
│ ├─ __init__.py
│ └─ _scroll.py
└─ docs/
├─ Makefile
├─ *build/
└─ source/
├─ conf.py
├─ index.rst
├─ modules.rst
├─ autogen_attributes.rst
├─ *_attributes/
├─ *_autosummary/
├─ _static/
└─ _templates/
└─ autosummary/
├─ attributes.rst
└─ module.rst
*Generated during build
The attribute lists will be placed in the directory docs/source/_attributes/.
The final documentation will be generated in docs/build/.
Open the file docs/build/html/index.html to visualize it.
SOURCE_DIR := source
BUILD_DIR := build
TEMPLATES_DIR := $(SOURCE_DIR)/_templates
AUTOSUMMARY_DIR := $(SOURCE_DIR)/_autosummary
ATTRIBUTES_DIR := $(SOURCE_DIR)/_attributes
ATTRIBUTES_LIST := $(ATTRIBUTES_DIR)/list.rst
AUTOGEN_ATTR_FILE := $(SOURCE_DIR)/autogen_attributes.rst
.PHONY: html attributes
html: $(ATTRIBUTES_LIST)
	sphinx-build -M html $(SOURCE_DIR) $(BUILD_DIR) -v -a -E

$(ATTRIBUTES_LIST): attributes
	$(file > $(ATTRIBUTES_LIST),$(subst $(eval ) ,,\
	$(foreach FILE,$(wildcard $(ATTRIBUTES_DIR)/*),\
	$(firstword $(file < $(FILE))))))

attributes: $(AUTOGEN_ATTR_FILE)
	rm -rf $(ATTRIBUTES_LIST)
	export PYTHONPATH=../src && sphinx-autogen -i -t $(TEMPLATES_DIR) $<
When you execute make html to build the documentation, the Makefile will execute the following commands:
rm -rf source/_attributes/list.rst
export PYTHONPATH=../src && sphinx-autogen -i -t source/_templates source/autogen_attributes.rst
sphinx-build -M html source build -v -a -E
import os
import sys
sys.path.insert(0, os.path.abspath('../../src'))
project = "Project name"
author = "Author name"
version = "1.0.0"
release = version
copyright = f"2025, {author}"
extensions = ['sphinx.ext.autodoc','sphinx.ext.autosummary']
templates_path = ['_templates']
exclude_patterns = ['build', '_attributes', 'autogen_attributes.rst']
html_theme = 'sphinx_rtd_theme'
html_static_path = ['_static']
autosummary_imported_members = True
Note that autogen_attributes.rst and _attributes/ are listed in exclude_patterns, because they are only used to generate the attributes list and must not be included in the final documentation.
Also, I am using the "Read The Docs" theme because it is pretty :). You can install it executing pip install sphinx-rtd-theme.
Main page title
===============
.. toctree::
   :maxdepth: 3

   modules
Main file of the documentation.
Modules
=======
.. autosummary::
   :toctree: _autosummary
   :template: module
   :recursive:

   rogue_scroll
This file generates documentation for each module using the template docs/source/_templates/module.rst.
The reST files generated during build will be placed in docs/source/_autosummary.
.. autosummary::
   :toctree: _attributes
   :template: attributes
   :recursive:

   rogue_scroll
   rogue_scroll._scroll
This is the main file parsed by sphinx-autogen to generate an attribute list for each module using the template docs/source/_templates/attributes.rst.
Usually, you would include only the top-level package (rogue_scroll), but Sphinx does not include private modules (any file starting with _) by default, so I had to include rogue_scroll._scroll manually.
{# PART 1 --------------------------------------------------------------------#}
{% for a in attributes -%}
{{ '%s.%s,' % (fullname, a) -}}
{% endfor -%}
{# PART 2 --------------------------------------------------------------------#}
{{ ',\n\n.. automodule:: %s\n\n' % fullname -}}
{% if modules -%}
{{ ' .. autosummary::\n'
' :toctree:\n'
' :template: attributes\n'
' :recursive:\n\n' -}}
{% for m in modules -%}
{{ ' %s\n' % m -}}
{% endfor -%}
{% endif -%}
This template is used to generate an attribute list for each module.
{{ fullname | underline }}
{# PART 1 --------------------------------------------------------------------#}
{% set attributes_file -%}
{% include '../_attributes/list.rst' -%}
{% endset -%}
{% set global_attributes = attributes_file.split(',') -%}
{% for a in global_attributes -%}
{% if a == '' -%}
{% set _ = global_attributes.pop(loop.index0) -%}
{% endif -%}
{% endfor -%}
{# PART 2 --------------------------------------------------------------------#}
{% set imported_attributes = [] -%}
{% for m in members -%}
{% if m not in modules ~ functions ~ classes ~ exceptions ~ attributes -%}
{% set outer_loop = loop -%}
{% for a in global_attributes -%}
{% if m == a.split('.')[-1] -%}
{% set _ = imported_attributes.append((a, m)) -%}
{% endif -%}
{% endfor -%}
{% endif -%}
{% endfor -%}
{# PART 3 --------------------------------------------------------------------#}
{{ '.. automodule:: %s' % fullname -}}
{% if attributes or imported_attributes -%}
{{ '\n\n .. autosummary::\n\n' -}}
{% for a in attributes -%}
{{ ' ~%s\n' % a -}}
{% endfor -%}
{% if imported_attributes -%}
{% for a in imported_attributes -%}
{{ ' ~%s\n' % (a[0].replace('%s.' % fullname, '', 1)) -}}
{% endfor -%}
{% endif -%}
{% endif -%}
{# PART 4 --------------------------------------------------------------------#}
{% if attributes or imported_attributes -%}
{% for a in attributes if a in members -%}
{{ '\n.. autodata:: %s\n' % a -}}
{% endfor -%}
{% if imported_attributes -%}
{% for a in imported_attributes -%}
{{ '\n.. py:data:: %s\n\n' % a[1] -}}
{{ ' .. autodata:: %s\n' % a[0] -}}
{{ ' :noindex:\n' -}}
{% endfor -%}
{% endif -%}
{% endif -%}
{# PART 5 --------------------------------------------------------------------#}
{{ '\n\nname: %s\n\n' % name -}}
{{ 'objname: %s\n\n' % objname -}}
{{ 'fullname: %s\n\n' % fullname -}}
{{ 'objtype: %s\n\n' % objtype -}}
{{ 'module: %s\n\n' % module -}}
{{ 'class: %s\n\n' % class -}}
{{ 'members: %s\n\n' % members -}}
{{ 'inherited_members: %s\n\n' % inherited_members -}}
{{ 'functions: %s\n\n' % functions -}}
{{ 'classes: %s\n\n' % classes -}}
{{ 'exceptions: %s\n\n' % exceptions -}}
{{ 'methods: %s\n\n' % methods -}}
{{ 'attributes: %s\n\n' % attributes -}}
{{ 'modules: %s\n\n' % modules -}}
{{ 'global_attributes:\n' -}}
{% for a in global_attributes -%}
{{ ' %s,\n' % a -}}
{% endfor -%}
{{ '\n' -}}
{{ 'imported_attributes:\n' -}}
{% for a in imported_attributes -%}
{{ ' (%s, %s),\n' % a -}}
{% endfor -%}
{{ '\n' -}}
This template is used to generate the documentation for each module.
PART 1 reads the generated list into global_attributes. PART 2 checks which of the global_attributes were imported and creates the imported_attributes list. With that, the full path rogue_scroll.SYLLABLES is available. However, I cannot use the directive .. autodata:: directly, as mentioned in the problem section. So I decided to use .. py:data:: for the top-level attribute and, inside it, .. autodata:: with the full-path attribute to get the correct docstring.

src/rogue_scroll/__init__.py:

"""rogue_scroll docstring"""
from ._scroll import SYLLABLES
from ._scroll import SCROLL_PROBS

src/rogue_scroll/_scroll.py:

"""_scroll docstring"""
SYLLABLES = 123
"""SYLLABLES docstring"""
SCROLL_PROBS = 123
"""SCROLL_PROBS docstring"""
I created this answer specifically for your case (Merry Christmas!), but I posted a more complete and generic example on my GitHub. It is licensed under the "do whatever the f*** you want" license (MIT).
This solution is not absolute. It is just a result of many days of research and template tinkering. 100% AI-free. If you can improve this solution somehow, please let me know!
Bumping because I'm having the same issue (although the recording is from the screen, not a webcam).
I'm being forced to screen-record using .webm, which does not include metadata, which honestly is a pain.
Any ideas?
What you want to do is analogous to having a pile of books and wanting to place a new one at the bottom. You’d have to lift the whole pile up, place the new book down, and the pile on top of it.
Now, you’ve changed the position of each book in the original pile by one, so what you’ve done isn’t as trivial as putting a book on top of the pile (read: push_back on a vector). But you’ve only moved books around without disturbing their contents, so it’s also not all that heavy.
Note that a vector in itself is a light object, essentially just a pointer to an array with bells n whistles. By adding a new vector to the head, you’re just shuffling around these lightweight objects - not the actual content pointed to by the vectors in your super vector.
As to how impactful that would be: this cannot be answered in the abstract; it depends on how often you’re doing it along with the idiosyncrasies of your target system. The only way to get a meaningful answer is to profile your code.
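The "pile of books" picture translates to any array-backed container. A Python sketch of the same point (an analogy, not the C++ in question): inserting at the front shifts every element, but when the elements are themselves containers, only references move and the inner contents are never copied.

```python
pile = [[1, 2], [3, 4], [5, 6]]
inner = pile[0]            # keep a handle on an inner "book"

pile.insert(0, [0, 0])     # O(n): every reference shifts right by one slot

assert pile[1] is inner    # the book moved position within the pile...
assert inner == [1, 2]     # ...but its contents were never touched or copied
```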
There are several formulations and exact solvers for the MWCS problem. It is commonly reduced to a MIP.
Example paper: https://arxiv.org/pdf/1605.02168
The algorithm is implemented in Java and solves three variants of the problem:
https://github.com/ctlab/virgo-solver
An R package:
Just add this:
validRange: function(nowDate) {
return {
start: nowDate,
};
},
I still have the same issue, and no matter what people write denying it, it happens to me in optimized release builds from the App Store, even with Impeller enabled (in macOS builds as well), even now at the end of 2025. The first "push" is always slow and janky.
So, answering the original question: "Are Transition animations supposed to be 'Janky' the first time they are run in Flutter?"
YES! Unfortunately, they ARE! This is still an issue that is ignored by the Flutter team!
Flutter (Channel stable, 3.35.1)
The Python worker is getting killed mid-stream while the data is serialized and sent via socket to it. For some reason, the Python worker crashes unexpectedly and the socket is closed; the JVM then reports that error to the driver.
Can you try increasing the overhead memory?
I do not know yet why this happened, but VS Code did not know the location of my Python interpreter inside the virtual environment managed by uv. Anyway, I pressed Ctrl+Shift+P and searched for "interpreter", selected the Python: Select Interpreter option, manually added the path to my .venv/bin/python, and it just worked. Thank you for helping.
I'm running into the same issue, did you find a solution for this?
It would seem of_led_get() is deprecated and devm_of_led_get() is the correct function to use.
Switching to this function fixed my problem; it would seem the function was removed between building my code under Linux and the OpenWrt code.
Your waveform is the ESC's 21-bit RLL telemetry frame (after the ~30µs low break). Assuming DShot300 (~1.25µs short high / ~2.5µs long high).
1. Start at first rising edge post-break (line idles low).
2. Mark 21 bits: each from rising edge to next (~3.3-3.8µs).
3. Per bit, measure high pulse width:
≤1.6µs: 0
≥2.0µs: 1 (Jitter OK; ignore lows.)
Your snippet: 1.25µs high = 0. Zoom out for full 21—should match `010001011101000101110` for 0 eRPM.
For ease, load into Saleae with DShot decoder. If bits flip, check inversion.
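The per-bit width thresholds from step 3 can be expressed as a tiny classifier; the thresholds are the ones given above, assuming DShot300 timing:

```python
def classify_bit(high_us):
    """Classify one telemetry bit by its high-pulse width in microseconds."""
    if high_us <= 1.6:
        return 0          # short high pulse (~1.25 µs at DShot300)
    if high_us >= 2.0:
        return 1          # long high pulse (~2.5 µs at DShot300)
    raise ValueError(f"ambiguous pulse width: {high_us} µs")

# The 1.25 µs pulse from the snippet decodes as a 0:
print([classify_bit(w) for w in (1.25, 2.5, 1.3, 2.4)])  # -> [0, 1, 0, 1]
```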
What's the context of this reference error? A screenshot would help.
Placing the line:
export GDK_PIXBUF_MODULE_FILE="usr/lib/gdk-pixbuf-2.0/2.10.0/loaders.cache"
into the AppRun script fixes the problem.
The default view under everything is black. So your tab bar is only over a black view if your ViewController doesn't go under it.
You can use messaging.send_each_for_multicast(message) instead of send_multicast(message). This sends the message to each device individually and handles failures per device.
is there a way to create array of Any?
Here are two ways to create an array of Any initialized with the named tuple (;some=missing):
fill!(Vector{Any}(undef, 10), (;some=missing))  # fill! returns its first argument
Any[(;some=missing) for _ in 1:10]

These forms are not interchangeable when the value of the filling expression is mutable. The first, with the fill! expression, will use the same value for all elements. The second, with the array comprehension, will create a separate value for each element. For example, if the value expression is [], the first creates one empty mutable array used in all elements, and the second creates a separate empty mutable array for each element.

@allocated fill!(Vector{Any}(undef, 10), [])  # 176 (bytes allocated)
@allocated Any[[] for _ in 1:10]              # 432 (bytes allocated)

Why does Vector{Any}((; some=missing), 10) fail?
The expression Vector{Any}((; some=missing), 10) fails because no method is defined for this case.
Constructor methods are only defined (as of Julia 1.12.0) for
Here is a try to define one:
Vector{T}(value::V, n) where {T, V <: T} = fill!(Vector{T}(undef, n), value)
With this definition, the expression Vector{Any}((; some=missing), 10) works.
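The shared-versus-fresh distinction above is not Julia-specific. Python has the exact same pitfall with `[x] * n` versus a comprehension, shown here for comparison:

```python
shared = [[]] * 3                # one list object, referenced three times
fresh = [[] for _ in range(3)]   # three independent list objects

shared[0].append(1)
fresh[0].append(1)

print(shared)  # -> [[1], [1], [1]]  (every slot is the same object)
print(fresh)   # -> [[1], [], []]
```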
I found this via a Google search:
npx playwright install --list
If somebody is facing the same issue with the Firebase emulator for an Expo app running on the iOS simulator, I got it fixed with the steps below.
Add host to firebase.json on the Firebase side:
"emulators": {
"functions": {
"port": 5001,
"host": "0.0.0.0"
},
"firestore": {
"port": 8080,
"host": "0.0.0.0"
},
"ui": {
"enabled": true
},
"singleProjectMode": true,
"export": {
"path": "emulator-data"
}
}
and then on the Expo side, where you initialize the Firebase apps:
import Constants from "expo-constants";
import { getApps, initializeApp } from "firebase/app";
import { getAuth } from "firebase/auth";
import { connectFirestoreEmulator, getFirestore } from "firebase/firestore";
import { connectFunctionsEmulator, getFunctions } from "firebase/functions";
const firebaseConfig = {
apiKey: process.env.EXPO_PUBLIC_FIREBASE_API_KEY,
authDomain: process.env.EXPO_PUBLIC_FIREBASE_AUTH_DOMAIN,
projectId: process.env.EXPO_PUBLIC_FIREBASE_PROJECT_ID,
storageBucket: process.env.EXPO_PUBLIC_FIREBASE_STORAGE_BUCKET,
messagingSenderId: process.env.EXPO_PUBLIC_FIREBASE_MESSAGING_SENDER_ID,
appId: process.env.EXPO_PUBLIC_FIREBASE_APP_ID,
};
const app = getApps().length ? getApps()[0] : initializeApp(firebaseConfig);
const origin = Constants.expoConfig?.hostUri?.split(':')[0] || 'localhost';
export const db = getFirestore(app);
export const auth = getAuth(app);
export const functions = getFunctions(app, "asia-south1");
if (__DEV__ && process.env.EXPO_PUBLIC_USE_EMULATORS === "1") {
console.log(`🔌 Using local Firebase emulators... ${origin}`);
connectFunctionsEmulator(functions, origin, 5001);
connectFirestoreEmulator(db, origin, 8080);
}
export default app;
I found the problem. In the image there was a flag set to read from bottom to top instead of top to bottom, and another left-to-right flag was wrongly set.
Have you considered looking at the x32 ABI? It literally addresses this problem: it takes advantage of 64-bit instructions with 32-bit pointers to avoid wasting memory (overhead).
I understand your question is multi-layered, and the run-time crashing of your custom strlen() is nearly a side note, but I thought I'd address just this one aspect nonetheless. Does your code care for the possibility of a NULL parameter as does the following custom strlen()?
Runnable code here: https://godbolt.org/z/xofh9sYqK
#include <stdio.h> /* printf() */
size_t mstrlen( const char *str )
{
size_t len = 0;
if( str ) /* Prevent run-time crash on NULL pointer. */
{
for(; str[len]; len++);
}
return len;
}
int main()
{
char s[] = "stars";
printf("mstrlen(%s) = %zu\n", s, mstrlen(s)); /* %zu is the correct format for size_t */
printf("mstrlen(NULL) = %zu\n", mstrlen(NULL));
return 0;
}
I only deploy my code once or twice a year, and I often forget to create the "signed" APK. When you don't create a signed build, it shows up in a "debug" folder.
When you select the signed option,
you will see your build in the "release" folder as an .aab file. You want to drag the .aab (for me it's the most recent file with no sequence number) to the Google Play Console release web page.
To view the app traffic you must install the VPN and the app's user certificate, but there is a common problem on Android: the .crt and .der formats are not supported for the VPN and app user certificate. I tried a .p12 certificate and it worked; try that.
$order = Order::where('uuid', $order_id)
    ->with(['client', 'service'])
    ->first();
This is the solution that worked for me.
The overridden prompt that you provided is incorrectly formatted. Check the format for errors, such as invalid JSON, and retry your request.
Limit: 5 files (10MB total size)
Format: .pdf, .txt, .doc, .csv, .xls, .xlsx
I am trying to upload a PDF instead of a JPG so that it can be handled correctly.
https://www.npmjs.com/package/node-html-parser seems like a good alternative if you don't want to use an offscreen doc, which seems a bit overkill imo.
The fix was to add the following to layout.js:
export const dynamic = "force-dynamic";
Here is the PR with the fix.
I got this solution after contacting DigitalOcean support.
I managed to find the solution for this.
I removed the 2 lines for QueueProcessingOrder and QueueLimit from my rate limiting logic in RateLimiterExtension.cs file.
Also added app.UseRouting() to my Program.cs file.
My rate limiting functionality now works as desired and returns 429 status code with the message when the number of HTTP requests is limited.
As @david-maze said (thanks!), you need to add steps to the Dockerfile to build in a folder other than the root directory. Add lines like this to your Dockerfile:
COPY Cargo.toml build.rs proto src /project/
WORKDIR /project
And then you could just add RUN (or not just¹):
RUN cargo build
This will prevent Cargo from scanning unnecessary system files.
¹ RUN with cache and a release build:
RUN --mount=type=cache,target=/root/.cargo/registry \
    cargo clean; cargo build --release
You might also want to check out mailmergic.com. It can take your Excel data and Word template and generate individual PDFs directly, without needing any VBA or macros.
It’s really straightforward to use and can save a lot of time compared to running Word macros, especially for large merges.
If you are looking for an API to integrate it directly into your website, you may look into https://aitranslate.in/api-documentation. It has image/PDF translation capability without losing the background, plus more features like OCR, erasure, etc.
Try running these commands in your terminal:
node -v
npm -v
If both show version numbers, then Node and npm are installed fine.
If you get an error, install Node.js from https://nodejs.org — npm comes with it.
During installation, make sure the “Add to PATH” option is checked.
If you already installed it but it’s still not recognized, you can manually add this path to your system environment variables:
"C:\Program Files\nodejs\"
Sometimes the error also happens if you’re using PowerShell — try opening Command Prompt (cmd) instead and check again.
Thanks for the help guys, the problem turned out to be the server was not adding the Access-Control-Allow-Origin header. So the server was basically not sending the result back to the client because of a security policy. Once I figured that out everything just worked.
Also, the "success:" option turned out to be deprecated, so the done/fail answer below is a great example of how you are supposed to do it now.
Simple solution for the programmer, but not optimal for CPU time:
#include <string> // for std::to_string

size_t sigfigs(double x)
{
    return std::to_string(x).size(); // create an r-value string and take its size
}
But you must take into account the presence of a "-" sign, the decimal point, the exponent with its sign, and all possible combinations. I think it's impossible to create a universal sigfigs() function, only a problem-specific one.
This is now (kind of) supported using the new auto-copy feature and modifying the S3 event notification to filter by suffix.
You can use dependencies like this, which will pass references instead of deep-cloning in functions:
I did a cross-platform application that clones from and to GitHub/Gitlab/Gitea and Local.
Download section: https://github.com/goto-eof/fromgtog?tab=readme-ov-file#download
As I understand it, the answer is yes, VS Code behaves this way somewhat by design. VS Code does not support running code out of the box; the "play" button (in the editor title) is added ad hoc by extensions. The play button in "Run & Debug" is built in, I think.
I just updated my SDK tools and then did "Invalidate Caches", checking all 3 checkboxes.
Hello, can someone help me add this lib through crosstool-ng?
sudo apt-get install usbmuxd libimobiledevice6 libimobiledevice-utils
I tried to do as mentioned above, but without success.
git clone https://github.com/libimobiledevice/libimobiledevice.git
cd libimobiledevice
./autogen.sh --prefix=`pwd`/builds
make
sudo make install
This should be handled via either a SysVar or an EnvVar.
In a Panel, you can assign a value to a system variable, which can be accessed in CAPL code.
For example:
variables
{
    byte xxxxx = 0; // assuming 8-bit data
}

on sysvar sysvar:xxxx // name your system variable
{
    xxxxx = @this;
}
on message CAN_0x598
{
    if (counter == 15) // alive counter wraps after 15
    {
        counter = 0;
    }
    else
    {
        counter++;
    }
    msg1.byte(0) = this.byte(0);
    msg1.byte(1) = this.byte(1);
    msg1.byte(2) = this.byte(2);
    msg1.byte(3) = xxxxx;      // changed through the Panel
    msg1.byte(4) = xxxxx + 1;  // pay attention to overflow, as a byte only holds up to 255
    msg1.byte(5) = this.byte(5);
    msg1.byte(6) = counter;
    msg1.byte(7) = this.byte(7);
}
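The alive-counter pattern in the handler is just a 4-bit wrap-around increment; in Python terms (illustration only, the CAPL above is what actually runs):

```python
def next_alive(counter):
    """4-bit alive counter: counts 0..15, then wraps back to 0."""
    return (counter + 1) % 16

# Walk the counter across the wrap point:
seq = []
c = 13
for _ in range(5):
    seq.append(c)
    c = next_alive(c)
print(seq)  # -> [13, 14, 15, 0, 1]
```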
You've probably solved this by now, but I came across your question and decided to share the workaround I used to solve it as well.
We initially chose the original next-runtime-env library because we wanted a Docker image that worked for multiple environments (build once, deploy many). However, using the library caused various issues with navigation. First, I encountered a bug when navigating using router.push from the custom not-found page. Then, after integrating ISR to improve performance and splitting different app layers with separate layouts, jumping between them also wiped out my global env variables.
I had two options to solve this:
I hesitated with the first option because in next-runtime-env's server environment, when you use the env() getter to read a variable from process.env[key], it calls unstable_noStore() (aka connection()), which enables dynamic rendering. I wanted to reduce unnecessary dynamic rendering, and this caused issues when I moved a route from SSR to ISR. Where dynamic rendering was genuinely needed, I decided to fetch data client-side and show skeleton loaders 💅.
A few points about the final implementation for those interested:
Now that I no longer depend on next-runtime-env, I can access runtime env in server components without enabling dynamic rendering. This opens the door for further performance improvements, such as migrating individual app pages from SSR to SSG/ISR.
I had a broken password entry in "Passwords and Keys" with no description, just "app_id" in the details.
After deleting that entry, VS Code stopped asking for a password.
https://reveng.sourceforge.io/crc-catalogue/16.htm
See CRC-16/IBM-3740
This C++ code works:
// width=16 poly=0x1021 init=0xffff refin=false refout=false xorout=0x0000 check=0x29b1 residue=0x0000 name="CRC-16/IBM-3740"
// Alias : CRC-16/AUTOSAR, CRC-16/CCITT-FALSE
unsigned short crc16ccitFalse (const char *pData, unsigned int length, unsigned short initVal /*= 0xFFFF*/)
{
const unsigned short polynomial = 0x1021;
unsigned short crc = initVal;
for (unsigned byte = 0; byte < length; ++byte) {
crc ^= (pData [byte] << 8);
for (int bit = 0; bit < 8; ++bit) {
crc = (crc & 0x8000) ? (crc << 1) ^ polynomial : (crc << 1);
}
}
return crc;
}
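For cross-checking, here is a direct Python port of the same algorithm (my own illustration, not part of the original answer); it reproduces the catalogue's published check value 0x29B1 for the ASCII string "123456789":

```python
def crc16_ccitt_false(data: bytes, init_val: int = 0xFFFF) -> int:
    """CRC-16/IBM-3740 (a.k.a. CRC-16/CCITT-FALSE): poly=0x1021, no reflection."""
    polynomial = 0x1021
    crc = init_val
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ polynomial) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF  # Python ints are unbounded, so keep the register at 16 bits
    return crc

print(hex(crc16_ccitt_false(b"123456789")))  # 0x29b1
```

The explicit `& 0xFFFF` replaces the truncation that `unsigned short` performs implicitly in the C++ version.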
One possible reason: on Linux, you need to check that the line
127.0.0.1 localhost
is present in the /etc/hosts file.
declare global {
interface HTMLElement {
querySelector: (paras: any) => HTMLElement;
}
}
One way to make ft_strlen(NULL) be rejected at compile time is to add this line just above the function's declaration:
__attribute__((nonnull(1)))
This is an attribute: a GCC/Clang compiler-specific directive that tells the compiler "the first argument to this function must not be NULL", so passing a literal NULL can trigger a -Wnonnull diagnostic.
When you open the Firebase console and view your database, whether Firestore or Realtime Database, you are reading from the database. Each time you refresh the console or navigate between collections and documents, it also triggers new reads. The only difference is that you're doing it manually via the console.
| GlobalScope | CoroutineScope |
|---|---|
| tied to the entire application lifecycle | tied to a specific component's lifecycle |
| cancellation is difficult and manual | cancellation is easy and automatic |
I had this problem. The cause was a 'System.ValueTuple' file in the root of my project; deleting that file fixed it.
You just need to run the file with root privileges.
First, compile:
gcc filename.c -o out
then run as root:
sudo ./out
I ran into similar issues when I started experimenting with Roblox scripting. Some external resources like Poxelio have fairly solid general Roblox tutorials that can help with the environment-setup part.
So here is a summary of some things methods can do that functions can't; please add comments if you have more points, so that I can update the list:
- Methods with pointer receivers can take either a pointer or a value, while functions with a pointer parameter must take a pointer.
- Methods must be implemented in order to satisfy an interface; this can't be achieved with functions.
In my case, starting up Xcode before opening the project did the trick.
I have two Python files, as shown below:
app.py
from flask import Flask
from routes import register_main_routes
def create_app():
app = Flask(__name__)
register_main_routes(app)
print('------------create_app()')
return app
app = create_app()
if __name__ == '__main__':
print('------------__main__()')
app.run(debug=False,port=5007)
routes.py
# routes.py
def register_main_routes(app):
@app.route('/')
def home():
return "<h1>Hello Flask web !</h1>"
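To check the wiring without starting a server, Flask's test client can drive the route directly. Here is a sketch that inlines the two files above into one self-contained script:

```python
from flask import Flask

def register_main_routes(app):  # same shape as routes.py above
    @app.route('/')
    def home():
        return "<h1>Hello Flask web !</h1>"

def create_app():
    app = Flask(__name__)
    register_main_routes(app)
    return app

# Exercise the route without app.run(); no server or port is needed
client = create_app().test_client()
response = client.get('/')
print(response.status_code)             # 200
print(response.get_data(as_text=True))  # <h1>Hello Flask web !</h1>
```

This is handy for confirming that register_main_routes actually attached the route before debugging anything server-related.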
Function TransfertDonnéesTableauVersPressePapier() ' called by the Save routine
Application.CutCopyMode = False ' reset the clipboard
Set MyData = New DataObject
Selection.Copy ' SURPRISING, but kept for TEST purposes; in place since 05/Oct/2025
' Seems to remove run-time error '-2147221040 (800401d0)':
' DataObject:GetFromClipboard - OpenClipboard failed
ActiveSheet.ListObjects("TabInscriptions").DataBodyRange.Select
Selection.Copy ' SURPRISING that this repetition is needed
MyData.GetFromClipboard
On Error Resume Next
End Function
To make the loading animation work, you can't rely on CSS :active alone, because the network request to p.php is asynchronous and takes time; you must use JavaScript to manage the loading state. The most practical way is to modify your button's HTML to contain both the normal <span id="buttonText">Send Data</span> and a hidden spinning element (<span id="buttonSpinner" class="spinner">). Your JavaScript listener then starts the loading state immediately: hide the text, show the spinner, and disable the button. Once the fetch completes (successfully or not), the .finally() handler stops the loading state by re-enabling the button, hiding the spinner, and restoring the text. This ensures the animation runs for exactly the duration of the server request.
A lazy way to import modules: override the built-in import function.
If you find it useful please leave a star~
Github: https://github.com/Magic-Abracadabra/magic-import
[🎬 Demo](https://github.com/Magic-Abracadabra/magic-import/blob/main/Demo.mp4)
Here is a brief way to use it; just copy the source code below to get started.
import builtins
import subprocess
import sys
from importlib.metadata import distributions

installed_packages = {dist.metadata['Name'] for dist in distributions()}
normal_import = builtins.__import__

def install(name, globals=None, locals=None, fromlist=(), level=0):
    builtins.__import__ = normal_import  # restore while we work, to avoid recursion
    try:
        if name not in installed_packages:
            # note: the pip *distribution* name may differ from the import name
            subprocess.check_call([sys.executable, '-m', 'pip', 'install', '-U', name])
            installed_packages.add(name)
        module = normal_import(name, globals, locals, fromlist, level)
    finally:
        builtins.__import__ = install  # re-arm the hook
    return module

builtins.__import__ = install
# start coding from here
Now the Python keyword ✨import✨ is under a magic spell: modules that are missing get installed at their latest version before your following imports run.
If you don't have numpy,
import numpy as np
will install it first, and then the module will be imported successfully. Yeah, that easy.
Once the hook is in place, subsequent imports work the same way, too:
import pyaudio, pymovie, pyautogui, ...
The following techniques can make Amazon Q Developer CLI more reliable:
Cross-examining its output with other LLMs like ChatGPT significantly improves quality, often within one or two rounds of back and forth. It functions almost like watching two experts debate.
Providing a reference application that follows best practices helps guide its output.
Manually approving every write operation with a preview prevents unintended changes.
Additional input and other approaches are welcome.
Go to
Settings -> Developer Options and check the APPS section: in my case, "Don't keep activities" was enabled; disabling it fixed the problem.
Were you able to figure this out by any chance?
The versioning was just wrong; adjust your Firebase versions until the problem goes away.
With the Global Interpreter Lock (GIL) removed in Python 3.14 (in the no-GIL build), true multi-threading is now possible, which opens the door to data races—where multiple threads access and modify shared data concurrently, leading to unpredictable behavior. Previously, the GIL implicitly prevented many race conditions. Now, developers must handle concurrency explicitly using thread synchronization mechanisms like threading.Lock, RLock, Semaphore, or higher-level tools like queue.Queue. Careful design, such as using immutable data structures, thread-safe collections, or switching to multiprocessing or async paradigms where appropriate, is essential to avoid bugs and ensure thread safety.
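A minimal sketch of the lock-based approach (a hypothetical shared-counter example; it runs on any Python version, but the lock only becomes strictly necessary once the GIL no longer serializes the read-modify-write):

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:  # serialize the read-modify-write on the shared counter
            counter += 1

threads = [threading.Thread(target=add_many, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 — deterministic, because every increment holds the lock
```

Without the `with lock:` line, two threads can read the same old value and both write back old+1, silently losing increments on a free-threaded build.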
I made an account for this. Full explanation and semi-rant at the end. Here's the Windows GUI-centric approach:
First, find your user or "home" folder in Windows. In the File Explorer, click "This PC", then click "Local Disk (C:)" then click Users, then click your name. This is your user or "home" folder. In here, create a new Text Document. Rename it "_vimrc" without the quotations. For a quick test to verify it's working, open that file with Notepad and type ":colorscheme blue" without quotations. Now open Vim and you should notice the bright blue color scheme. To undo this, close Vim, open your _vimrc file and delete what you typed, then save it, re-open Vim, and Vim will return to the default color scheme.
Bear in mind Vim came from Linux which is derived from Unix. I remember when I was new to all this and what helped was using a Linux distribution (Debian) for a while. I noticed a LOT of this type of stuff resides in the "home" folder otherwise referred to as "~". Like Windows but organized differently. So when you're using something like Vim, developed for Unix/Linux, you have to think in that way. Very command prompty (No, it's a CLI! No, it's a terminal!, No, it's a TTY!!!!!), hacker man, power usery vs. Windows which is "Monkey click icon, monkey happy".
I just figured out how to do this vimrc stuff today for myself by poking around in the Vim docs in my Vim install folder, for hours. I found a file, "vimrc_example.vim". At the top it says, "An example for a vimrc file . . . to use it, copy it to" then it lists where to copy "it" to for various operating systems. This is already confusing. Is he (Bram, the creator of Vim) saying copy this file to another location? Well, it's called "vimrc_example.vim" so that assumption must be wrong because I know the file should be something like "vimrc"! Okay, so he means to say, "Copy the text of this document to your vimrc file", right? But what is that file CALLED? Does it exist? And where? Do I need to make it? Where do I put it? We will get there. So he says for Windows, to "copy it to":
$VIM\_vimrc
Yes. There. Right there. Put it in there and you're good. Ha. See, Windows never really made us learn this type of stuff like Linux people have to. So, in case you don't know: $VIM is a variable that expands to Vim's install location, and the \ means put the following file in there; the vimrc. Breaking that down, we must find Vim's install location (the $VIM part) and in that folder create a file (the elusive _vimrc).
Refer https://github.com/isar/isar/issues/1679#issuecomment-3393987462 for the solution. It worked for me.
The execution object exposes keys like running_count, failed_count, succeeded_count, etc.,
so we can rely on them to determine the status.
# This command installs the necessary Python libraries.
# - ollama: The official client library to communicate with our local Ollama server.
# - pandas: A powerful library for loading and working with data from our CSV file.
!pip install -q ollama pandas
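Once installed, the pandas half can be exercised like this (a minimal sketch; the CSV content is made up for illustration, standing in for the real CSV file):

```python
import io
import pandas as pd

# Inline CSV so the example is self-contained; in practice you would pass a file path
csv_text = "question,answer\nWhat is 2+2?,4\n"
df = pd.read_csv(io.StringIO(csv_text))

print(len(df))               # 1 row loaded
print(df.loc[0, "answer"])   # 4
```

Calling the local Ollama server is omitted here because it requires the Ollama daemon to be running.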
In my case it was a fresh VS 2022 install. I had to open the Android Device Manager for the first time; it seems to initialize some things. After that, my device showed up!
You can try this lib: https://github.com/webcooking/zpl_to_gdimage
It worked for me.
5xx HTTP errors indicate a server-side problem, not a client one. First, make sure your server is working correctly; then debug your request with Postman.
Based on the code you provided earlier, nothing looks wrong, but you must debug the request itself.
Change public override bool Equals(object obj) to public override bool Equals(object? obj) so that the override matches the signature of the method you're overriding.
I believe that when you open the app from the Start menu, it runs through a shortcut that uses the .NET version already on your PC (4.7.2), so it works. But when you double-click the exe, it looks for Framework 4.8 specifically, and since that version isn't installed, it fails. Try installing .NET Framework 4.8.