Run your app with
newArchEnabled=false
in gradle.properties, then don't embed it. Simple problem, solved.
I have a similar question regarding Vega-Lite in Power BI. I want to wrap the x-axis label after each ; but this solution didn't work for me, even without using the offset function.
"labelExpr": "replace(datum.label, /;\\s*/g, '\\n')"
It removes the ; but doesn't wrap.
{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "data": {"name": "dataset"},
  "encoding": {
    "x": {
      "field": "Bezeichner",
      "type": "nominal",
      "axis": {
        "title": "",
        "labelAngle": 0,
        "labelAlign": "center",
        "labelBaseline": "middle",
        "labelLineHeight": 12,
        "labelFontSize": 15,
        "labelPadding": 10,
        "labelLimit": 500,
        "labelExpr": "replace(datum.label, /;\\s*/g, '\\n')"
      }
    }
  },
  "layer": [
    {
      "mark": {"type": "bar", "opacity": 1},
      "encoding": {
        "y": {
          "field": "LCC",
          "type": "quantitative",
          "axis": {
            "title": "",
            "titleColor": "#118DFF",
            "orient": "right",
            "labelFontSize": 15,
            "labelFont": "Roboto",
            "labelPadding": 10,
            "grid": true
          }
        },
        "xOffset": {"value": 57},
        "color": {"value": "#118DFF"},
        "tooltip": [
          {"field": "Bezeichner", "type": "nominal", "title": "Bezeichner"},
          {"field": "LCC", "type": "quantitative", "title": "LCC"},
          {"field": "LCA", "type": "quantitative", "title": "LCA"}
        ]
      }
    },
    {
      "mark": {"type": "bar", "opacity": 1},
      "encoding": {
        "y": {
          "field": "LCA",
          "type": "quantitative",
          "axis": {
            "title": "",
            "orient": "left",
            "titleColor": "#AAE574",
            "grid": false,
            "labelFontSize": 15,
            "labelFont": "Roboto",
            "labelPadding": 10
          }
        },
        "xOffset": {"value": 8},
        "color": {"value": "#AAE574"},
        "tooltip": [
          {"field": "Bezeichner", "type": "nominal", "title": "Bezeichner"},
          {"field": "LCC", "type": "quantitative", "title": "LCC"},
          {"field": "LCA", "type": "quantitative", "title": "LCA"}
        ]
      }
    }
  ],
  "resolve": {"scale": {"y": "independent"}},
  "config": {
    "bar": {"size": 50},
    "scale": {
      "bandPaddingInner": 0.845,
      "bandPaddingOuter": 0.5
    }
  }
}
There is an option in the Firebase console:
Go to Firebase console -> Authentication -> Settings -> Password policy
and tick all the options you want to enforce.
This blog post https://babichmorrowc.github.io/post/2019-03-18-alpha-hull/ explains how to do it with this package https://github.com/babichmorrowc/hull2spatial?tab=readme-ov-file. It currently outputs Spatial* class objects, but these can be easily converted to terra 'SpatVector' or to 'sf'.
According to JEP-483:
Class paths must contain only JAR files; directories in class paths are not supported because the JVM cannot efficiently check them for consistency.
To be honest, I am not sure you even get any advantage in startup time during development; the extra work of training runs and putting things together may not be worth the effort.
I am facing a similar kind of issue: I am trying to establish two-way communication between my native C++ plugin and a UXP plugin. I have added listeners and senders, but it is not working.
I am getting this error:
❌ [UXP-COMM] Failed to register UXP message listener: 1344357988 (0x50214664)
❌ [UXP-COMM] Error details: gPlugInRef=0x11fddd028, UXPMessageHandler=0x37b306550
and a similar issue when I try to send a message to the UXP plugin.
void SendMessageToUXPPanel(const std::string& messageType, const std::string& data) {
    if (!sUxpProcs) {
        LOG_TRACE((0, "❌ [UXP-COMM] UXP suite not available"));
        return;
    }
    try {
        PIActionDescriptor desc;
        SPErr err = sPSActionDescriptor->Make(&desc);
        if (err != kSPNoError) {
            LOG_TRACE((0, "❌ [UXP-COMM] Failed to create descriptor"));
            return;
        }
        // Set message type and data
        sPSActionDescriptor->PutString(desc, 'type', messageType.c_str());
        sPSActionDescriptor->PutString(desc, 'data', data.c_str());
        sPSActionDescriptor->PutString(desc, 'time', std::to_string(time(NULL)).c_str());
        // Send to UXP panel
        const char* UXP_PLUGIN_ID = "Test-v0qxnk"; // From your manifest.json
        err = sUxpProcs->SendUXPMessage(gPlugInRef, UXP_PLUGIN_ID, desc);
        if (err == kSPNoError) {
            LOG_TRACE((0, "✅ [UXP-COMM] Message sent to UXP panel: %s", messageType.c_str()));
        } else {
            LOG_TRACE((0, "❌ [UXP-COMM] Failed to send message to UXP panel: %d", err));
        }
        // Clean up descriptor
        sPSActionDescriptor->Free(desc);
    }
    catch (...) {
        LOG_TRACE((0, "❌ [UXP-COMM] Exception in SendMessageToUXPPanel"));
    }
}
if (sUxpProcs) {
    LOG_TRACE((0, "🔍 [UXP-COMM] UXP suite acquired, registering message listener..."));
    LOG_TRACE((0, "🔍 [UXP-COMM] gPlugInRef: %p, UXPMessageHandler: %p", gPlugInRef, UXPMessageHandler));
    SPErr err = sUxpProcs->AddUXPMessageListener(gPlugInRef, UXPMessageHandler);
    if (err == kSPNoError) {
        LOG_TRACE((0, "✅ [UXP-COMM] UXP message listener registered successfully"));
    } else {
        LOG_TRACE((0, "❌ [UXP-COMM] Failed to register UXP message listener: %d (0x%x)", err, err));
        LOG_TRACE((0, "❌ [UXP-COMM] Error details: gPlugInRef=%p, UXPMessageHandler=%p", gPlugInRef, UXPMessageHandler));
    }
} else {
    LOG_TRACE((0, "❌ [UXP-COMM] UXP suite not available for message listener (suiteErr: %d)", suiteErr));
}
The documentation doesn't contain proper information about this feature, so I'm not sure what to do next.
To handle multiple post-filters in Spring Cloud Gateway, ensure a deterministic filter order, modify response bodies in a non-blocking, reactive way, and chain modifications across filters while maintaining performance and thread safety.
ActiveWorkbook.BuiltinDocumentProperties("Creation Date") is a Date type. It needs to be converted to a string; do it as follows.
Format(ActiveWorkbook.BuiltinDocumentProperties("Creation Date"), "yyyymmdd_hhnnss")
I hope this is helpful.
You can try dict.fromkeys()
list1 = [1, 2, 3, 4]
list2 = [3, 4, 5, 6]
merged = list(dict.fromkeys(list1 + list2))
print(merged)
Output:
[1, 2, 3, 4, 5, 6]
In my case I used nginx proxy_pass to access the OpenSearch VPC endpoint dashboard from the public internet.
This might help:
https://repost.aws/knowledge-center/opensearch-dashboards-vpc-cognito
Did you ever get an answer to this?
Hi, how or where do I find the title for a screenshot I took and downloaded with Lighthouse? I need it as evidence to send, but I can't seem to find it. Can anyone help?
Try setting nodeLinker to "hoisted" in pnpm-workspace.yaml.
Just do this for a full reload:
window.location.reload()
The error occurs because TensorFlow 2.10.0 isn't available as a standard wheel for macOS arm64, so pip can't find a compatible version for your Python 3.8.13 environment. If you're on Apple Silicon, replace tensorflow==2.10.0 with tensorflow-macos==2.10.0 and add tensorflow-metal for GPU support, while also relaxing the numpy, protobuf, and grpcio pins to match TF 2.10's dependency requirements. If you're on Intel macOS, you can keep tensorflow==2.10.0 but still need to adjust those dependency pins. Alternatively, the cleanest fix is to upgrade to Python 3.9+ and TensorFlow 2.13 or later, which installs smoothly on macOS and is fully supported by LibRecommender 1.5.1.
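As a quick sketch, the choice of pin described above can be made programmatically; the package names come from the answer, and the platform check is an assumption about how your interpreter reports itself:

```python
import platform
import sys

# Sketch: pick the TensorFlow pin from the answer above based on the platform
# the interpreter reports. Adjust versions to your own environment.
if sys.platform == "darwin" and platform.machine() == "arm64":
    pins = ["tensorflow-macos==2.10.0", "tensorflow-metal"]  # Apple Silicon
else:
    pins = ["tensorflow==2.10.0"]  # Intel macOS and other platforms
print(pins)
```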
I wrote an MSc thesis a long time ago exactly about the question you asked, it is titled "Committed-Choice Programming Languages", you may find it helpful. Download link below:
https://1drv.ms/b/c/987ed5526a078e8f/EY-OB2pS1X4ggJiMEwAAAAABv08vK5GeA6Ci6F8IZ44wlA?e=GqRGM5
I thought Prolog and the 5th Generation Programming Project was dead a long time ago. I am bemused to see interest in this subject.
It seems like they offer the option if you do another API call to
It is possible to connect, like described here: https://blog.consol.de/software-engineering/ibm-mq-jmstoolbox/
The main parts are:
SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('admin') AUTHADD(DSP, CONNECT, INQ)
SET AUTHREC PROFILE('SYSTEM.ADMIN.COMMAND.QUEUE') OBJTYPE(QUEUE) PRINCIPAL('app') AUTHADD(DSP, PUT, INQ)
SET AUTHREC PROFILE('SYSTEM.DEFAULT.MODEL.QUEUE') OBJTYPE(QUEUE) PRINCIPAL('app') AUTHADD(DSP, GET)
SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('app') AUTHADD(DSP)
* Create a queue
DEFINE QLOCAL('MY.QUEUE.1') REPLACE
* Authorize app user
SET AUTHREC PROFILE('MY.QUEUE.1') OBJTYPE(QUEUE) PRINCIPAL('app') AUTHADD(BROWSE, GET, PUT, INQ)
Then connect with JMSToolBox:
If you want to connect using the app user you have to use the DEV.APP.SVRCONN channel
If you want to connect using the admin user you have to use the DEV.ADMIN.SVRCONN channel
For deployment: installing Rollup does not include react and react-dom. Rollup uses its own default React packages, so make sure to account for this in your setup.
For responsiveness in Compose, XML, and Kotlin/Java, I recommend this library:
No response in 11 years. Hope you got the answer at the time.
But for now: the Origin header can't be modified in your frontend code written in React/Angular, etc.
But it can be changed via API clients like Postman.
Use the "m" button on the right side. In the Maven toolbar that appears, press the "Execute Maven Goal" button and double-click the "mvn install" goal. The Maven output will be printed on the left side in the "Run" output panel.
Never mind. I just realized that this function doesn't need to be written into the component at all, since it doesn't depend on any component state—I can simply move it to the utils file.
If you want to convert datetime to timestamp and you are in a different timezone than UTC, you might want to look into the function CONVERT_TZ()
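For comparison, here is the same pitfall illustrated in Python rather than MySQL: an epoch timestamp is only correct once the datetime's UTC offset is accounted for, which is what CONVERT_TZ() handles on the SQL side. The +2 offset below is purely illustrative.

```python
from datetime import datetime, timedelta, timezone

# A wall-clock time in a UTC+2 zone (illustrative offset, not your server's).
local = datetime(2024, 1, 1, 12, 0, tzinfo=timezone(timedelta(hours=2)))

# .timestamp() converts via UTC, so the +2h offset is subtracted out.
print(int(local.timestamp()))  # 1704103200 == 2024-01-01 10:00:00 UTC
```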
The only way out?
Change your package name and start a new app listing, like a phoenix reborn from (numeric) ashes.
The Tailwind CLI tool works for me in v4, and back in v3: https://tailwindcss.com/docs/installation/tailwind-cli
In a Dockerfile:
RUN apt-get update && apt-get install -y nodejs npm
RUN npm install tailwindcss @tailwindcss/cli
Then run:
npx @tailwindcss/cli -i ./src/site.css -o ./src/output.css --watch
Never reload. It is easy to reload with a keyboard shortcut if needed. Actually, this dialog should be reduced to a notification that other programs are modifying the file. The devs suggest you use version control if needed.
If you use Gitea:
git push origin test_branch:refs/for/development_branch -o topic="test"
If it’s only CSS, did you try applying a transparent cursor to both the html and body elements? Sometimes just targeting body isn’t enough, especially in fullscreen mode:
I did two steps, and it worked:
!pip install -U transformers huggingface_hub
from fpdf import FPDF
# Create instance of FPDF class with UTF-8 support using DejaVu font
pdf = FPDF(format='A4')
pdf.add_page()
# Add DejaVu fonts for Unicode support
pdf.add_font('DejaVu', '', '/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf', uni=True)
pdf.add_font('DejaVu', 'B', '/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf', uni=True)
# Title
pdf.set_font('DejaVu', 'B', 18)
pdf.multi_cell(0, 10, "सच्ची दुनिया और सच्चा इंसान", align='C')
pdf.ln(5)
# Body
pdf.set_font('DejaVu', '', 12)
content = """दुनिया को बदलने से पहले, हमें खुद को समझना सीखना चाहिए।
अक्सर हम सोचते हैं कि दुनिया बुरी है, लोग गलत हैं, किस्मत साथ नहीं देती —
लेकिन सच्चाई यह है कि दुनिया वैसी ही होती है, जैसी हमारी सोच होती है।
अल्बर्ट आइंस्टीन ने कहा था —
“जीवन का असली मूल्य इस बात में है कि हम दूसरों के लिए क्या करते हैं।”
जब हम दूसरों की मदद करते हैं, जब किसी के चेहरे पर मुस्कान लाते हैं,
तो वहीं से हमारी असली सफलता शुरू होती है।
ज्ञान या पैसा बड़ा नहीं होता — बड़ी होती है इंसानियत।
महात्मा गांधी ने भी कहा —
“सत्य और अहिंसा ही सबसे बड़ी ताकत हैं।”
उन्होंने अपने जीवन से सिखाया कि सच्चाई पर टिके रहना कठिन जरूर है,
पर अंत में वही जीतता है।
जो खुद के अंदर की बुराइयों से लड़ता है, वही सच्चा विजेता होता है।
हम सब इस दुनिया को जानना चाहते हैं —
लेकिन असली समझ तब आती है, जब हम अपने मन की दुनिया को पहचानते हैं।
जब हम गुस्से की जगह धैर्य चुनते हैं,
नफरत की जगह प्यार, और डर की जगह विश्वास —
तभी हम दुनिया को वैसा देख पाते हैं, जैसी वो सच में है — सुंदर, सच्ची और अवसरों से भरी।
इसलिए याद रखिए —
दुनिया बदलने की शुरुआत “आप” से होती है।
अगर आप थोड़ा बेहतर इंसान बन जाएं,
तो आपकी वजह से दुनिया भी थोड़ी बेहतर हो जाएगी। 🌞"""
pdf.multi_cell(0, 8, content, align='J')
pdf.ln(10)
# Author name at the bottom right
pdf.set_font('DejaVu', '', 12)
pdf.cell(0, 10, 'लेखक: P.K. Yadav 720', 0, 0, 'R')
# Save PDF
file_path = "/mnt/data/Sacchi_Duniya_aur_Saccha_Insaan.pdf"
pdf.output(file_path)
file_path
If you are working with RN cli, try to use react-native-startup-splash, this library is built using turbo modules and supports both platforms.
I found the following way to solve the issue.
__files/search-response-e.json
{{#assign 'page-size'}}20{{/assign~}}
{{#each request.query}}
{{#if (eq @key 'size')}}
{{#assign 'page-size'}}
{{this}}
{{/assign~}}
{{/if~}}
{{/each~}}
Iterating over the request.query content doesn't include the size property when it's not sent as a query parameter. So it is possible to iterate over all elements in request.query, check if any of the keys matches the parameter {{#if (eq @key 'size')}} and then assign the value replacing the default one if the parameter is present.
It is a solution, but it's very verbose, and it's hard to tell what it's doing at first glance. I would appreciate it if anyone knows a better, cleaner way to solve this.
I'd like to know if there is a way to push an SBOM and then, using Dependency-Track's API, automatically get the UUID or URL of the SBOM that was pushed.
After pushing the SBOM to my Dependency-Track instance and requesting:
/api/v1/project/lookup?name=ECU3D06&version=0.0.1
I get the following response:
Access to the specified project is forbidden
probably because this SBOM's project is not added to my Portfolio Access Control team. Is there a way to add it to the latter automatically in the latest version of Dependency-Track?
This topic is first on Google, so: the official answer from the Isotope devs is https://github.com/metafizzy/isotope/issues/1216. Remove all "transition: all" rules from Isotope items, as they interfere with Isotope's inner classes.
Applications can regain continuity and consistency by restoring past configurations, data, and session information from the database.
In How to control significant digits, ONLY when necessary, in a Thymeleaf template? they find a workaround for the case that it is an integer:
<span th:text="${user.averageScore} % 1 == 0? ${user.averageScore} :${#numbers.formatDecimal(user.averageScore, 0, 2)}"/>
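The same conditional idea, sketched in Python for clarity (the helper name fmt is mine, not part of Thymeleaf): show the plain integer when the value has no fractional part, otherwise format to two decimals.

```python
def fmt(x: float) -> str:
    # Integral values print without decimals; others get two decimal places.
    return str(int(x)) if x % 1 == 0 else f"{x:.2f}"

print(fmt(4.0))   # 4
print(fmt(4.25))  # 4.25
```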
Try to use react-native-startup-splash, this library is built using turbo modules and supports both platforms.
Best secure platform for mod app download: you can find genuine mod APKs on websites such as APKPure, APKMirror, and Uptodown. All three of these sites offer a wide selection of mod APKs for various apps and games. Additionally, you can find mod APKs on XDA Developers, which is a great source for Android-related content.
Another option is to move your VAT calculation into a global snippet and include it in both product-template.liquid and cart-template.liquid. This ensures the same VAT-inclusive price appears site-wide.
In which world do you expect a mod app to be secure? It’s like trying to give birth in outer space.
transformToByteArray internally assumes Node-style buffers. Convert stream to ArrayBuffer safely in Deno using something like:
const byteArray = new Uint8Array(await new Response(response.Body).arrayBuffer());
Instead of using request.query.size directly, you need to access it using the lookup helper on the query parameters map:
{{#assign "page-size"}}
{{#if (lookup request.query "size")}}
{{lookup request.query "size"}}
{{else}}
20
{{/if}}
{{/assign}}
For Android UI design in Photoshop, it’s best to start with a base canvas size matching the density bucket you’re targeting like mdpi (baseline 160 dpi) for 320x480 pixels. Design your layout there at 72 dpi resolution in RGB mode. Then create scaled versions for hdpi, xhdpi, xxhdpi, etc., by multiplying the base size accordingly (e.g., 1.5x for hdpi). This approach helps keep your design sharp across different screen densities and sizes. Also, follow Material Design guidelines for consistent spacing and typography. Keep your layers organized for easier scaling and export.
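The density math above can be sketched quickly; the factor table follows Android's standard density buckets, and the 320x480 baseline is the one named in the paragraph:

```python
# Compute export sizes for each Android density bucket from an mdpi baseline.
base_w, base_h = 320, 480  # mdpi baseline from the text above
factors = {"mdpi": 1.0, "hdpi": 1.5, "xhdpi": 2.0, "xxhdpi": 3.0, "xxxhdpi": 4.0}

for bucket, f in factors.items():
    print(f"{bucket}: {int(base_w * f)}x{int(base_h * f)}")
# hdpi comes out to 480x720, xhdpi to 640x960, and so on.
```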
Go to Build Phases and remove Info.plist if there is one inside "Copy Bundle Resources"; your project already knows it exists!
According to https://www.jidesoft.com/history/index.php#3.7.4, the bug was fixed in that version.
The correct term is a Power User Interface (or sometimes an Expert-Oriented Interface).
These interfaces are optimized for efficiency and speed, not for ease of learning. They assume users are already familiar with the system, allowing fast command entry and minimal visual overhead.
Examples include command-line tools, airline reservation terminals, and advanced editors like Vim or Emacs.
🔹 Note: This is not the same as an Expert System, which refers to an AI system that simulates human expertise in a specific domain.
The term you’re looking for is often called a “power user interface” or “expert interface.” These UIs are designed specifically for users who need speed and efficiency, often using shortcuts, commands, or minimal visuals to get things done faster like command-line tools or pro software. It’s different from general user-friendly interfaces meant for beginners.
Downgrading pylance worked for me, I downgraded it to "2024.12.1". I suspect this problem is caused by the server version being too old.
function display_class_category() {
    global $post;
    // Make sure we are using the correct post in the loop
    setup_postdata( $post );
    $target_categories = array( 'bread', 'cake', 'brownie' );
    $categories = get_the_category( $post->ID );
    $output = '';
    if ( $categories ) {
        foreach ( $categories as $category ) {
            if ( in_array( $category->slug, $target_categories ) ) {
                $category_link = get_category_link( $category->term_id );
                $output .= '<div class="link-cat">
                    <a href="' . esc_url( $category_link ) . '">' . esc_html( $category->name ) . '</a>
                </div>';
            }
        }
    }
    wp_reset_postdata();
    return $output;
}
add_shortcode( 'class_category', 'display_class_category' );
setup_postdata( $post ) ensures that WordPress functions like has_category() or get_the_category() reference the current post in the loop — not a leftover global value.
No return inside the loop — so you can correctly build $output for each category.
wp_reset_postdata() cleans up after the shortcode so it doesn’t mess with the rest of the loop.
<?php while ( have_posts() ) : the_post(); ?>
<h2><?php the_title(); ?></h2>
[class_category]
<?php endwhile; ?>
Now each post in your archive should show the correct linked category (bread, cake, or brownie) according to its own category.
Would you like it to show only the first matching category, or all matching ones (if a post has multiple from that list)?
I faced the same issue. What I did was: Open Xcode → Settings → Components → Others, then install the required iOS simulators. After the installation, I got the simulator with Rosetta. Now I’m able to build the iOS app and run it on the Rosetta simulator without any issues.
I can use an iOS 26 device for debugging after switching Flutter from the stable channel to the master channel. For complete details, check here: https://github.com/flutter/flutter/issues/163984
Yes! I've been using sa-token-rust, which is a lightweight, high-performance authentication and authorization framework inspired by the popular Java sa-token library.
It provides everything you need in one cohesive framework:
✅ Complete authentication and authorization
✅ Multiple web framework support (Axum, Actix-web, Poem, Rocket, Warp)
✅ JWT with 8 algorithms (HS256/384/512, RS256/384/512, ES256/384)
✅ OAuth2 authorization code flow
✅ WebSocket authentication
✅ Real-time online user management and push notifications
✅ Distributed session for microservices
✅ Event listener system
✅ Security features (Nonce, Refresh Token)
✅ 7 token generation styles
✅ Production-ready with comprehensive tests
You state that you
need to install the source version of terra to use the INLA package (the binary version is not compatible).
That is almost certainly not true. Where did you get that idea?
If I do
install.packages("terra")
install.packages("INLA", repos=c(getOption("repos"), INLA="https://inla.r-inla-download.org/R/stable"), dep=TRUE)
all works as expected.
The reason that this is not working for you is that you are using an ancient version of Rtools and probably also of R.
Use mysql+pymysql instead of
mysql+mysqlconnector
I got around this by creating a meeting from the e-mail using "respond with an event" and then adding the event to OneNote. This way it works even with 5000 notes and above.
I found that create-next-app installed Tailwind 3.x here in the fall of 2025, and as such, the steps above were helpful to upgrade to Tailwind 4.x. However, that alone was not sufficient.
My ultimate fix was to remove --turbopack from this line in package.json:
"dev": "next dev",
Apparently, it skips Tailwind's preprocessing stages, doesn't handle @theme, @layer, or custom directives, and injects only raw CSS imports.
I have created an iOS app in SwiftUI to handle KML/KMZ files. It works swiftly with these file formats, and I have added certain other features that might be useful during field surveys. https://apps.apple.com/in/app/we-map/id6751641623
I found a solution that works for me. You have to handle it inside the iframe, not in the parent website. The idea: listen to the wheel event of the iframe document, and if the scroll position is at the top or bottom, preventDefault the wheel event. In my case, only the div with id message_list is overflow-auto, so I use it to detect whether the scroll position is at the top or bottom. Hope this helps you all. My app is Vue 3, by the way.
const handleWheel = (e) => {
  try {
    const el = document.getElementById('message_list');
    if (!el) {
      return;
    }
    const delta = e.deltaY;
    const atTop = el.scrollTop === 0;
    const atBottom = el.scrollTop + el.clientHeight >= el.scrollHeight;
    if ((delta < 0 && atTop) || (delta > 0 && atBottom)) {
      e.preventDefault();
    }
  } catch {
    //
  }
};

onMounted(() => {
  document.addEventListener('wheel', handleWheel, {
    passive: false,
  });
});
There are a few bugs in Swift 4.2.1 (2017) with Float80.
For example, Float80(2.718281828459045312) truncates to Float64 and then stores the inaccurate result in Float80, giving: $R75: Float80 = 2.7182818284590450908.
Likewise, let q: Float80 = Float80(2.718281828459045312) gives: q: Float80 = 2.7182818284590450908.
The only way around the bug is to give up on the Float80() initializer altogether: let q: Float80 = 2.718281828459045312 correctly gives: q: Float80 = 2.71828182845904531197.
I had an underscore in my username (user_q), and that was making the rabbit upset.
Mustn't upset the rabbit
In my case, the issue was actually due to an incorrect keymap selected in the dropdown. You can also search for `Show Context Actions` to see what it is mapped to currently.
Nice to meet you. I'm experiencing a similar issue. Have you found a solution yet?
In my case, it seems the topology refresh is working correctly, but the connection watchdog keeps trying to reconnect with the old IP address.
My current settings are as follows:
enablePeriodicRefresh(true)
enableAllAdaptiveRefreshTriggers()
dynamicRefreshSources(true)
autoReconnect(true)
and I am using DirContextDnsResolver as the dnsResolver.
Perhaps try specifying which openssl you want to use during configure and make; i.e., to successfully install PHP from source on my Mac I used:
# install missing requirements
brew install re2c libiconv pkg-config
# configure (you can change PATH for a single command without needing to change your ~/.bashrc or ~/.bash_profile)
PATH=/opt/homebrew/Cellar/bison/3.8.2/bin/:/opt/homebrew/opt/libiconv/bin/:/opt/homebrew/bin/:$PATH ./configure --with-openssl
# make the executable, then check you don't have any errors in the installation
PATH=/opt/homebrew/Cellar/bison/3.8.2/bin/:/opt/homebrew/opt/libiconv/bin/:/opt/homebrew/bin/:$PATH make
make test
# if there are no errors, install the software
make install
Does this approach solve your problem?
Environment variables like $POSTGRES_PASSWORD are only processed when container initialization runs on an empty database directory; otherwise the users and passwords already in the database state are kept, no new users are created, and no passwords are changed. You should clear the stale state in the /data directory on the host, i.e., the directory mounted at /var/lib/postgresql/data in the container.
PS: Thanks to @David Maze for posting the comment that finally worked as a correct answer! I'm not familiar with the community and can only pay tribute this way :( If there's a proper way to cite this, please let me know.
Try disabling extensions and seeing if one of them is ruining your day. For me it was Atomineer.
I was able to approve tools by using the Agents interface within VS Code as described on this page: https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/develop/vs-code-agents-mcp
Never found a way to approve through the web UI.
I was unable to use the setup provided by @lightning_missile. Instead I manually wrapped the client like so:
import json
from typing import Any, Dict

import aiohttp
import boto3
from yarl import URL
from opensearchpy.helpers.signer import AWSV4Signer  # import path may vary by opensearch-py version

class OpensearchAsyncClient:
    def __init__(self, endpoint, region, access_key, secret_key, session_token):
        self.endpoint = endpoint
        self.signer = AWSV4Signer(
            boto3.Session(
                aws_access_key_id=access_key,
                aws_secret_access_key=secret_key,
                aws_session_token=session_token,
            ).get_credentials(),
            region
        )
        self.session = aiohttp.ClientSession()

    async def make_request(self, method, path, params, body: Dict[str, Any], timeout=30):
        url = f"https://{self.endpoint}/{path.lstrip('/')}"
        url_encoded = str(URL(url).with_query(params))
        body_str = json.dumps(body)
        headers = self.signer.sign(
            method,
            url_encoded,
            body_str,
            {'Content-Type': 'application/json'},
        )
        async with self.session.request(
            method, url,
            params=params,
            json=body,
            headers=headers,
            timeout=timeout
        ) as response:
            if response.status != 200:
                res_json = await response.json()
                raise ValueError(f"Unable to make request: {str(response)}: {res_json}")
            return await response.json()
This is unfortunate because it prevents using opensearch-py directly. I'm including the answer because it does make use of the constructs opensearch-py provides; there is likely a way to inject an HTTP client into opensearch-py so that it works, but I cannot find a method at the moment.
I was able to do it by using custom layers on Nivo
This guy here says it runs just fine; that's nothing like what you're saying.
The problem is not the Python script. I ran the code under two different versions of the same IDE: one displays perfectly fine, and the newer version is distorted.
I think I’ve figured out the reason: the cloud mask function removes many images with clouds, which reduces the amount of available data. The availability also varies across different regions depending on cloud conditions. The code works if we retrieve one image from every two months instead of one.
Figured it out! Indeed, it was a matter of adding the right binary to the correct package manually (it didn't work when I tried adding it straight to '.'; I had to add it to the casadi package explicitly).
This was done by adding the following line to the .spec file under the Analysis config:
binaries=[('venv/Lib/site-packages/casadi/_casadi.pyd', 'casadi')]
(A .pyd is like a DLL designed for Python.)
Thanks to @furas for pointing this out!
I was having the same issue while using a JDK Alpine image in the build stage, and solved it by changing it to a regular JDK image. I was then able to generate my Docker container even with ${os.detected.classifier} in the POM.
From:
FROM maven:3.9.9-eclipse-temurin-21-alpine AS builder
to:
FROM maven:3.9.9-eclipse-temurin-21 AS builder
the same image that is used in the project repository:
https://github.com/chrisblakely01/java-spring-microservices/blob/main/billing-service/Dockerfile
For anyone interested, this is the project's course video: https://www.youtube.com/watch?v=tseqdcFfTUY by Chris Blakely.
Awesome course.
How can I update the capacity of a finetuned GPT model on Azure using Python?
The code wasn't working due to a bug on MSFT side. They fixed the bug last week, and as a result updating the capacity of a finetuned GPT model on Azure using Python is now working.
Sometimes it's just the add-in buttons that don't appear. Try opening PowerPoint online. On the Home ribbon, click the Add-ins button. If your add-in is on the flyout that opens, select it and the buttons should appear.
When you specify a Content-Type of application/x-www-form-urlencoded, the XMLHttpRequest API is expecting that your post data be formatted just like query parameters. i.e.: ip=val1&ua=val2&country=val3. The following code should post to your script by first encoding the FormData entries into a URLSearchParams instance, then exporting that to a string.
var data = new FormData();
data.append('ip', 'val');
data.append('ua', 'val2');
data.append('country', 'val3');
var xhr = new XMLHttpRequest();
xhr.open('POST', 'visitor.php?type=visitorControl', true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
var params = new URLSearchParams(data);
xhr.send(params.toString());
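For reference, the urlencoded wire format the server expects (ip=val1&ua=val2&country=val3) is exactly what Python's urllib.parse.urlencode produces, shown here as a language-neutral cross-check:

```python
from urllib.parse import urlencode

# Encode the same fields the JavaScript example appends to FormData.
payload = urlencode({"ip": "val", "ua": "val2", "country": "val3"})
print(payload)  # ip=val&ua=val2&country=val3
```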
In addition to what @p-sampath mentioned, when working with Azure infrastructure, think about which layer to make changes in. If you have any infrastructure like Azure Application Gateway, Azure Front Door, or other reverse proxies/load balancers involved, it's likely they are the real and final server to the client. In that case, follow these steps [here](https://learn.microsoft.com/en-us/azure/application-gateway/hsts-http-headers-portal) to have your ApplicationGateway add a rewrite ruleset to inject the strict-transport-security header into the response for the https listener. Otherwise, any changes you make to the front end or the App Service settings directly don't even make it past the gateway to the client.
Finally found a way to exclude .js files from TypeScript checks.
You can add this rule in your settings.json file:
"js/ts.implicitProjectConfig.checkJs": false
Looking at the examples in the docs, it seems like you should omit the BearerToken in the Route environment type when you attach the middleware.
The warning appears in development (where dev keys are correct) but not in production (where dev keys are the actual problem). It only triggers when you're doing things right.
Also, if you make a warning that cannot be suppressed, people will just ignore it. Warnings need to be real, actionable issues—otherwise it's just noise that trains developers to tune out your messages.
Consider detecting the actual problem case (dev keys in production builds) rather than warning during normal development work.
This issue is really bugging me because I like a clean console!
Install the GraphQL app on your store and run your query. It's likely that you are exceeding the cost limit of the query and need to break it into smaller parts. The response comes back as empty with no error when I run it from Powershell, but in their app, it will actually tell why the query isn't working.
I have never seen an API silently fail like this. This has been maddening.
I just ran into a similar issue.
Instead of having something like this on your DTO:
#[Assert\Range(min: 0)]
#[Assert\NotNull()]
public float $price = null;
You can explicitly allow integers as well:
#[Assert\Range(min: 0)]
#[Assert\NotNull()]
public float|int|null $price = null;
Seems like the issue has been resolved in newer versions of React Native Executorch. You can give it a try, and if this won't work for you, please re-open the following issue on GitHub.
Use the direction property:
<ScrollView horizontal style={{direction: 'rtl'}} contentContainerStyle={{flexDirection: 'row'}}>
<Item />
<Item />
<Item />
</ScrollView>
Give this a try:
import yfinance as yf
from curl_cffi import requests

with requests.Session(impersonate="chrome") as session:
    session.verify = False  # disable SSL verification
    ticker = yf.Ticker(ticker=ticker_symbol, session=session)
This functionality has now been built into Positron: https://github.com/posit-dev/positron/pull/9324
Delete the older version files from the cache path. That worked for me.
Adding this as a help to anyone facing the same challenge:
WSO2 Identity Server supports Bcrypt from version 7.1.0 onwards. For IS 7.1.0, the support is provided through the Bcrypt Hash Provider Connector. You can find the configuration steps for this connector here: https://github.com/wso2-extensions/identity-hash-provider-bcrypt/blob/main/README.md.
Once the connector is set up, for this specific use case where the existing Bcrypt-hashed passwords need to be migrated, you can migrate the usernames and their corresponding password hashes directly into the WSO2 Identity Server user store schema using a carefully written database script. This way, users will be able to continue logging in with their existing passwords, and there’s no need to force a password reset flow.
If your migration scenario differs (e.g., you want to prompt users to reset their passwords), refer to the official WSO2 documentation for recommended migration approaches: https://is.docs.wso2.com/en/7.1.0/guides/users/migrate-users/
🗓️ Duck and Quail Egg Incubation Calendar
Date | Day | Task | Duck status | Quail status
Oct 9 | Day 1 | Set the duck eggs in the incubator | Start | —
Oct 10–19 | Days 2–11 | Turn the eggs 3–4 times a day; keep humidity at 55–60% | Developing | —
Oct 20 | Day 12 (duck) / Day 1 (quail) | Set the quail eggs in the incubator | Developing | Start
Oct 21–27 | Duck days 13–19 / quail days 2–8 | Turn the eggs daily (both kinds) | Normal development | Development begins
Oct 28–Nov 2 | Duck days 20–25 / quail days 9–14 | Keep turning the eggs | Embryos active | Blood vessels forming
Nov 3 | Duck day 26 / quail day 15 | Stop turning the quail eggs; raise humidity to 70% | Final stage | Final stage
Nov 4–6 | Duck days 27–28 / quail days 16–17 | Keep the incubator running (do not switch it off); keep the water trays full | Hatching time | Hatching time
Nov 6, 2025 — 🎉 both the ducks and the quail hatch 🐣🦆
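The scheduling logic behind the calendar can be sketched in a few lines, assuming the incubation periods the calendar implies (about 28 days for ducks, 17 for quail): the quail eggs are set 28 − 17 = 11 days after the duck eggs so both batches finish together.

```python
from datetime import date, timedelta

# Incubation periods assumed from the calendar above
DUCK_DAYS, QUAIL_DAYS = 28, 17

duck_set = date(2025, 10, 9)                                # day 1 for the duck eggs
quail_set = duck_set + timedelta(days=DUCK_DAYS - QUAIL_DAYS)  # delay so both hatch together

print(quail_set)  # the calendar's quail set date
```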
import javax.cache.event.*;

// Listener that prints a stack trace on every JCache put, so you can see
// which code path performed the write.
public class JCachePutTraceListener
        implements CacheEntryCreatedListener<Object, Object>,
                   CacheEntryUpdatedListener<Object, Object> {

    @Override
    public void onCreated(Iterable<CacheEntryEvent<?, ?>> events)
            throws CacheEntryListenerException {
        for (CacheEntryEvent<?, ?> e : events) {
            System.out.println("[JCache] CREATED key=" + e.getKey());
            // Print the caller's stack trace without actually throwing
            new RuntimeException("JCache PUT caller trace").printStackTrace();
        }
    }

    @Override
    public void onUpdated(Iterable<CacheEntryEvent<?, ?>> events)
            throws CacheEntryListenerException {
        for (CacheEntryEvent<?, ?> e : events) {
            System.out.println("[JCache] UPDATED key=" + e.getKey());
            new RuntimeException("JCache PUT caller trace").printStackTrace();
        }
    }
}
Then register it synchronously, so the trace is printed on the thread that performed the put:
import javax.cache.Cache;
import javax.cache.configuration.FactoryBuilder;
import javax.cache.configuration.MutableCacheEntryListenerConfiguration;

Cache<Object, Object> cache = cacheManager.getCache("yourCache");
MutableCacheEntryListenerConfiguration<Object, Object> cfg =
        new MutableCacheEntryListenerConfiguration<>(
                FactoryBuilder.factoryOf(JCachePutTraceListener.class),
                null,   // no filter
                false,  // old value not required
                true    // **synchronous** => runs on the caller thread
        );
cache.registerCacheEntryListener(cfg);
from docx import Document
from fpdf import FPDF
# Create a Word document for the assignment
doc = Document()
# Title and header section
doc.add_heading('Assignment: Preparing an Effective Job Description for Customer Service Executives', level=1)
doc.add_paragraph("Name: Ramavatar Godara")
doc.add_paragraph("Subject: Human Resource Management")
doc.add_paragraph("College: JECRC University")
doc.add_paragraph("Date: 12-10-2025")
# Situation section
doc.add_heading("Situation:", level=2)
doc.add_paragraph(
"A fast-growing e-commerce company is hiring new customer service executives. "
"The management observes that many new employees leave within a few months, stating that "
"the actual work differs from what they expected during hiring."
)
# Question section
doc.add_heading("Question:", level=2)
doc.add_paragraph(
"As the HR Manager, how would you prepare a clear and effective Job Description for the customer service executive role "
"to avoid such mismatches? What key elements would you include in the job description, and why?"
)
# Answer section
doc.add_heading("Answer:", level=2)
doc.add_paragraph(
"As an HR Manager, preparing a clear and effective job description is essential to ensure that potential candidates have a "
"realistic understanding of the role. This helps to align expectations, improve employee satisfaction, and reduce early resignations."
)
doc.add_heading("Steps to Prepare a Clear Job Description:", level=3)
steps = [
"1. Job Analysis: Study the duties of existing executives and consult team leaders.",
"2. Define the Purpose of the Role: Explain why the role exists and how it contributes to company goals.",
"3. List of Key Responsibilities: Handle queries, maintain records, and meet performance targets.",
"4. Required Skills and Qualifications: Communication skills, computer literacy, and calmness under pressure.",
"5. Work Environment and Schedule: Mention shifts, night duties, or remote work details.",
"6. Performance Expectations: Define measurable targets and behavioral expectations.",
"7. Growth Opportunities: Include potential promotions and learning opportunities.",
"8. Compensation and Benefits: State salary range, incentives, and other perks."
]
for step in steps:
doc.add_paragraph(step, style='List Number')
# Key elements table
doc.add_heading("Key Elements Included in the Job Description and Why:", level=3)
table = doc.add_table(rows=1, cols=2)
hdr_cells = table.rows[0].cells
hdr_cells[0].text = 'Element'
hdr_cells[1].text = 'Purpose'
elements = [
("Job Title & Summary", "Provides a quick understanding of the role."),
("Duties & Responsibilities", "Clarifies what the employee will actually do."),
("Required Skills", "Ensures candidates assess their own suitability."),
("Work Conditions", "Prevents misunderstandings about shifts or work type."),
("Performance Metrics", "Sets clear expectations for success."),
("Growth & Benefits", "Motivates and retains employees.")
]
for element, purpose in elements:
row_cells = table.add_row().cells
row_cells[0].text = element
row_cells[1].text = purpose
# Conclusion section
doc.add_heading("Conclusion:", level=2)
doc.add_paragraph(
"A well-structured job description acts as a communication tool between HR and employees. "
"It ensures that candidates fully understand the nature of their job, leading to better job satisfaction, "
"reduced turnover, and improved organizational performance."
)
# Save as Word document
word_path = "/mnt/data/HR_Job_Description_Assignment.docx"
doc.save(word_path)
# Convert to PDF (simple typed style since true handwriting fonts require local font files)
pdf_path = "/mnt/data/HR_Job_Description_Assignment.pdf"
pdf = FPDF()
pdf.add_page()
pdf.set_font("Times", size=12)
pdf.multi_cell(0, 10, txt="""
Assignment: Preparing an Effective Job Description for Customer Service Executives
Name: Ramavatar Godara
Subject: Human Resource Management
College: JECRC University
Date: 12-10-2025
Situation:
A fast-growing e-commerce company is hiring new customer service executives. The management observes that many new employees leave within a few months, stating that the actual work differs from what they expected during hiring.
Question:
As the HR Manager, how would you prepare a clear and effective Job Description for the customer service executive role to avoid such mismatches? What key elements would you include in the job description, and why?
Answer:
As an HR Manager, preparing a clear and effective job description is essential to ensure that potential candidates have a realistic understanding of the role. This helps to align expectations, improve employee satisfaction, and reduce early resignations.
Steps to Prepare a Clear Job Description:
1. Job Analysis: Study the duties of existing executives and consult team leaders.
2. Define the Purpose of the Role: Explain why the role exists and how it contributes to company goals.
3. List of Key Responsibilities: Handle queries, maintain records, and meet performance targets.
4. Required Skills and Qualifications: Communication skills, computer literacy, and calmness under pressure.
5. Work Environment and Schedule: Mention shifts, night duties, or remote work details.
6. Performance Expectations: Define measurable targets and behavioral expectations.
7. Growth Opportunities: Include potential promotions and learning opportunities.
8. Compensation and Benefits: State salary range, incentives, and other perks.
Key Elements Included in the Job Description and Why:
- Job Title & Summary: Provides a quick understanding of the role.
- Duties & Responsibilities: Clarifies what the employee will actually do.
- Required Skills: Ensures candidates assess their own suitability.
- Work Conditions: Prevents misunderstandings about shifts or work type.
- Performance Metrics: Sets clear expectations for success.
- Growth & Benefits: Motivates and retains employees.
Conclusion:
A well-structured job description acts as a communication tool between HR and employees. It ensures that candidates fully understand the nature of their job, leading to better job satisfaction, reduced turnover, and improved organizational performance.
""")
pdf.output(pdf_path)
(word_path, pdf_path)
Adding onClick handlers to divs is generally not recommended.
If you do, add role="button" to the div you want to make clickable.
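A sketch of what that can look like in React/TypeScript (handler names here are hypothetical): besides role="button", the div needs tabIndex to be keyboard-focusable and a keydown handler so Enter/Space activate it like a real button.

```typescript
// Enter and Space are the keys that activate a native button.
const isActivationKey = (key: string): boolean => key === "Enter" || key === " ";

// Hypothetical JSX usage (handleClick is your existing click handler):
// <div
//   role="button"            // exposed to assistive technology as a button
//   tabIndex={0}             // reachable with the keyboard
//   onClick={handleClick}
//   onKeyDown={(e) => { if (isActivationKey(e.key)) handleClick(); }}
// >
//   Click me
// </div>
```

If the element behaves entirely like a button, a real button element with CSS reset is usually the simpler fix.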
func loginButtonClicked() {
    let loginManager = LoginManager()
    loginManager.logOut()
    loginManager.logIn(permissions: [.email], viewController: nil) { loginResult in
        switch loginResult {
        case .success(let grantedPermissions, _, let token):
            self.returnUserData()
            print("Success", token, grantedPermissions)
        case .cancelled:
            print("Cancel")
        case .failed(let error):
            print(error.localizedDescription)
        }
    }
}
Concurrency: interleaved execution of parts of different processes through switching, where only one part executes at any instant.
Parallelism: simultaneous execution of parts of different processes, where multiple parts execute at the same time (e.g. on multiple cores).
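The distinction can be illustrated with a minimal sketch: here two tasks run concurrently on one thread by cooperatively yielding to each other, so their steps interleave even though only one runs at any instant (parallelism would instead put them on separate cores, e.g. with multiprocessing).

```python
import asyncio

# Two concurrent tasks on one thread: each does a step of work, then yields
# control so the other can run. Only one executes at any instant.
async def worker(name: str, log: list) -> None:
    for i in range(2):
        log.append(f"{name}{i}")   # do one step of work
        await asyncio.sleep(0)     # switch: let the other task run

async def main() -> list:
    log = []
    await asyncio.gather(worker("A", log), worker("B", log))
    return log

log = asyncio.run(main())
print(log)  # the two tasks' steps interleave
```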
For deploying Medusa v2’s admin panel on Render, you typically need to build the admin frontend separately since it’s a React app that outputs static files like index.html. Render needs to serve these built files, so you should run the build command (usually npm run build in the admin folder) and point Render to the build folder as the static site root. If you’re only deploying the Medusa backend on Render, then yes—you’ll usually deploy the admin frontend separately (for example, on Vercel) to properly serve the React app. Your current repo likely only contains the backend, so having a separate frontend repo and deployment is recommended.
Thank you @masoudiofficial, your hint was enough to get me to a workable version for my use case, which seems to be responsive too.
/* Position the tooltip */
position: fixed;
top: 40%;
left: 50%;
transform: translate(-50%, -5%);
z-index: 80;
I had an issue importing modules from a Lambda layer.
I was on macOS, and each time I wanted a compressed archive I used the Finder GUI. That was the culprit in my case. When I switched to zipping from the CLI, it worked like a charm.