Great code! Here is a GitHub fork of the code, with some interesting things.
I had the same situation; you can take a look at my post "Custom timer implemented as foreground service experiences delay after background".
I hope it at least helps others.
You need to follow the requirements listed below, based on the Spring Boot version you use. This solved my issue.
Hello,
Welcome to Stack Overflow. The answer to your question is that in a quiver plot, np.ones_like is used to create dx and dy vectors of equal length. These vectors determine the direction and size of the arrows. Since all arrows have the same length, we can see the variation of the vector field more clearly. In effect, np.ones_like helps us observe the effect of f on the arrow directions more distinctly. I hope this answers your question. Thanks.
To technically check the existence of an email address, you generally use two main methods: checking MX records and performing an SMTP handshake. For catch-all domains, there's a different challenge and special handling is needed.
Start by finding if the domain part of the email (after the "@") has mail servers configured via MX records. This is done using DNS queries:
Use a tool like nslookup, dig, or any DNS library to retrieve MX records for the domain.
If no MX records are found, the domain cannot receive mail and any address at this domain will not be valid.
Example (using command line):
text
nslookup -q=mx example.com
A response listing MX records confirms that the domain can receive emails.
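As a sketch of the decision step in code (in practice you would resolve the MX records with a DNS library such as dnspython; here the resolved records are passed in so the example stays self-contained, and the hostnames are made up):

```python
def pick_mail_server(mx_records):
    """mx_records: list of (preference, hostname) tuples as returned
    by a DNS MX lookup. Returns the preferred server, or None if the
    domain has no MX records (and thus cannot receive mail)."""
    if not mx_records:
        return None
    # A lower preference value means a higher priority
    return min(mx_records, key=lambda rec: rec[0])[1]

print(pick_mail_server([(20, "mx2.example.com"), (10, "mx1.example.com")]))
print(pick_mail_server([]))
```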
Once MX records are confirmed, you can simulate the SMTP protocol to check recipient validity:
Connect to the mail server on port 25.
Initiate an SMTP transaction, stop just before sending any actual message.
Use the sequence: EHLO, MAIL FROM:, then RCPT TO: with the target email.
The response to RCPT TO: indicates if the mailbox exists:
250: Address is accepted (often means it exists)
550: Address does not exist
450/greylisting: Try later, or address is temporarily unavailable
Example interaction:
text
telnet mail.example.com 25
EHLO test.com
MAIL FROM:<[email protected]>
RCPT TO:<[email protected]>
QUIT
Note: Some servers employ greylisting, tarpitting, or accept all addresses (catch-all), which can lead to false positives or delays.
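A minimal Python sketch of the handshake and the response-code interpretation (smtplib is in the standard library; the host, HELO name, and addresses below are placeholders, and real servers may greylist or block such probes):

```python
import smtplib

def interpret_rcpt_code(code):
    """Map an SMTP RCPT TO response code to a verdict."""
    if code == 250:
        return "accepted"      # often means the mailbox exists
    if code == 550:
        return "nonexistent"
    if code in (450, 451):
        return "retry_later"   # greylisting / temporarily unavailable
    return "unknown"

def probe_mailbox(mx_host, target, helo="test.com", sender="check@test.com"):
    """Run the handshake up to RCPT TO, never sending an actual message.
    mx_host, target and sender are placeholders for illustration."""
    with smtplib.SMTP(mx_host, 25, timeout=10) as smtp:
        smtp.ehlo(helo)
        smtp.mail(sender)
        code, _reply = smtp.rcpt(target)
        return interpret_rcpt_code(code)

print(interpret_rcpt_code(250))
print(interpret_rcpt_code(550))
```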
With catch-all domains, the mail server accepts all RCPT TO requests, regardless of whether the mailbox actually exists. This means SMTP handshake alone can't determine if a specific email is valid:
A "catch-all" server always replies with 250 OK, even for fake addresses.
Advanced validation requires behavioral intelligence, sending patterns, or using probabilistic/risk scoring, sometimes combined with historical sending data.
Most simple verifiers will report these as "unknown" or "catch-all"; advanced commercial solutions may provide a risk assessment.
For thorough B2B or SaaS use-cases, combine MX checks, basic SMTP handshake, and catch-all/risk scoring for the best results. Consider privacy, rate limiting, and IP reputation (avoiding mass lookups from the same IP) to prevent server blocks.
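One common catch-all heuristic is probing a random, almost certainly nonexistent address: if the server accepts it, every 250 from that domain is inconclusive. A sketch with the RCPT check injected as a function, so the flow can be shown without a live server (the simulated servers below are illustrations, not real probes):

```python
import uuid

def classify_address(domain, address, rcpt_accepts):
    """rcpt_accepts(addr) -> bool should perform the real RCPT TO probe.
    Returns 'invalid', 'catch_all' (inconclusive), or 'valid'."""
    if not rcpt_accepts(address):
        return "invalid"
    # Probe a random address that should not exist
    probe = f"{uuid.uuid4().hex}@{domain}"
    if rcpt_accepts(probe):
        return "catch_all"  # server accepts everything; result is inconclusive
    return "valid"

# Simulated servers for illustration
catch_all_server = lambda addr: True
strict_server = lambda addr: addr == "real@example.com"

print(classify_address("example.com", "real@example.com", strict_server))
print(classify_address("example.com", "real@example.com", catch_all_server))
print(classify_address("example.com", "fake@example.com", strict_server))
```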
For robust, developer-friendly email existence validation with detailed deliverability insights and catch-all detection, see Email Address Validation – SMTPing — a reliable solution for modern SaaS and sales teams.
File name for yaml: gooddisplay,gdey0154d67.yaml - GOOD
DRV_COMPAT: gooddisplay_gdey0154d67 - GOOD
compatible in yaml: compatible: "gooddisplay,gdey0154D67" - VERY BAD; note the uppercase D, third character from the end.
There are other issues, but I can figure those out now that my yaml is being picked up in the build.
npm install -g eas-cli --force
You will also have to add this import: import androidx.compose.foundation.layout.fillMaxSize
This should be a comment, but I'm unable to fit this message within the comment size limit -w-
Are you watching the entire address space, or the address region of ART's Java heap? My speculation is that your userfaultfd is interfering with GC on the ART (Android Runtime) VM. Android 13 replaced the GC with one that uses userfaultfd to improve various things, and it was also backported to Android 12.
Because I can't see the code, I don't know for sure. UFFDIO_REGISTER can return EBUSY if the range overlaps one already watched by the ART VM, so if that error is indeed ignored by your code, it would lead to this issue.
See https://en.wikipedia.org/wiki/Android_version_history#Android_13 and https://man7.org/linux/man-pages/man2/uffdio_register.2const.html#ERRORS
🎶✨ Arabic Singers & Songs List ✨🎶
🎤 Alshami – Bethoon
🎤 Ayman Amin – Enti w Bass
🎤 Mohamed Ramadan – Number One
🎤 Ahmad Saad – Wasaa Wasaa
🎤 Akhras – Skaba
🎤 Assala Nasri – Enta Souri Hor
🎤 Saif Nabil – Ashq Moot
You can use <blockquote>text without a copy button</blockquote>, for example.
If the installation through the GUI doesn't work, you can try other installation options such as the CLI or direct upload!
Installation options:
Using the GUI: From your Jenkins dashboard, navigate to Manage Jenkins > Manage Plugins and select the Available tab. Locate this plugin by searching for dependency-check-jenkins-plugin.
Using the CLI tool:
jenkins-plugin-cli --plugins dependency-check-jenkins-plugin:5.6.1
Using direct upload: download one of the releases and upload it to your Jenkins controller.
It seems that setting "rust-analyzer.check.workspace": false in .vscode/settings.json might do the trick.
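For reference, a minimal .vscode/settings.json with just that flag (setting name as given above; check your rust-analyzer version's documentation if it has no effect):

```json
{
  "rust-analyzer.check.workspace": false
}
```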
It's not a v1beta endpoint issue; you are using a retired Gemini model (see the retired models list): both Gemini 1.5 Pro and 1.5 Flash were retired on 9/24/2025. That's the reason you get a 404 NOT FOUND error (in either Vertex AI or AI Studio). You should migrate to Gemini 2.0/2.5 Flash or later instead, following the latest Gemini migration guide.
Use the best online tool to generate the secure password to use in any application.
https://spg-sans.netlify.app/
OK, just found the solution myself. As I suspected, it was hiding elsewhere in the options: uncheck Tools > Options > Text Editor > C/C++ > Code Style > General > Comments > Continue single line comments. The same can be accomplished at a higher level by unchecking Tools > Options > Text Editor > C/C++ > Code Style > General > Comments > Insert existing comment style at the end of new lines when writing comments, but I suspect this will clip even more functionality, so the first option is minimally invasive.
It may be late, but I got the same error (that's why I'm here ^^) and resolved it by doing the migration explained here:
It consists of removing
import io.flutter.plugin.common.PluginRegistry.Registrar
and using this instead:
import io.flutter.embedding.engine.plugins.FlutterPlugin
For me, it works.
Try using this instead:
--jar=file:///usr/lib/hadoop/hadoop-streaming.jar
In my case, I just needed to add the SHA-1 key to Firebase from the Google Play Console. You can find it at Test and release > App integrity > Play app signing > Upload key certificate.
Just copy the keys and add them under Firebase > Project settings > Your apps.
Did you find a fix? I have the same problem now.
I was facing the same issue. I was indeed also inside a numbered list, but I found a way to avoid this numbering being applied to the source code I want to paste. Simply right-click where you plan to paste in Word, then select the "Keep source formatting" paste option. Et voilà! :D
When you use "pushl", it subtracts 4 bytes from your esp. The reason you see 1 is that after the pushl executed, esp moved to another location, so 0xffffcb60 is not where the 85 is stored; the value you see is whatever was there before pushl was called.
Try something like :
task.creator = user
task.save()
...and so on for an assignee and a verifier
This automatically manages the foreign key relationship properly.
@colonel Thank you very much,
you are a life saver!
Awesome work, thanks for sharing this info; it really helped me a lot.
The requests library is used with its get function, requests.get(url), to fetch the raw HTML of the website you are requesting. You cannot get the TFO data rendered by JavaScript without using a browser automation tool like Selenium.
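For illustration, here is the raw-fetch idea using the standard library's urllib (the requests version is the one-liner requests.get(url).text); either way you only get what the server sends, with no JavaScript execution. A data: URL is used so the example runs without network access:

```python
from urllib.request import urlopen

def fetch_raw(url):
    """Fetch the raw response body exactly as the server sent it,
    with no JavaScript execution."""
    with urlopen(url) as resp:
        return resp.read().decode("utf-8")

# data: URL used so the example needs no network access
print(fetch_raw("data:text/plain,hello"))
```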
Adding to Greg's answer: my solution was to modify the following two fields in the Eclipse settings to get better text scaling. This scaled enough of the text to make things bearable.
Tree and Table font for views
Part title font
OMG, I'm new to this so I feel stupid: I had 3 'Release' configurations, and the correct one is named 'Debug' 🤦♂️
So after deleting the other 2 and publishing only with Debug, the UI is now correct.
I found a workaround for the latest JMeter version, 5.6.3.
First, I modified the report template to show all statistics and graphs on a single page; then I modified the main Bootstrap CSS file to make all elements print-friendly (avoiding breaks/clutter).
Finally, I used the browser's print option (Ctrl+P) to save as PDF. If you are still interested, I can share the modified template with you.
To enable Full-Text Search in scenarios where .NET Aspire acts as an orchestrator or development environment host, the key point is that FTS is a SQL Server engine-level feature and is directly related to the configuration of the SQL Server instance itself; therefore, the first requirement is to ensure that FTS is available and enabled on the instance to which the Aspire application connects. In practical terms, there are two main paths: either use an external SQL Server instance (VM, managed service, or physical server) on which FTS is installed and started, or the SQL Server instance should be run as an image/container that contains the FTS package (usually the package associated with mssql-server-fts). Once the capable engine is provided, full-text catalogs and indexes should be designed and created at the database level, and then text searches can be performed using the relevant operators and functions.
Infrastructure Prerequisites and Considerations: The user or team must have sufficient administrative access to the SQL Server instance or the ability to build and deploy a custom container image; must be familiar with the full-text structural requirements, including the need for a unique key index for the target table, language/LCID settings for text parsing, and managing stoplists and word breakers for proper behavior in the required languages. It is also essential to understand the limitations and implications of using SQL Server in a containerized environment: The official SQL Server on Linux image typically does not provide FTS by default, and if using a container, the FTS package must be added to the image or an external instance must be used. At the operational level, the necessary planning for data persistence, backup/recovery strategies, and HA/replication solutions in the production environment must also be done, as running SQL Server in a container for production requires careful consideration of I/O, monitoring, and maintenance strategies.
Practical solutions can be divided into two paths: the first path connects to an external SQL Server instance on which FTS is enabled; in this case, the responsibility for installing and maintaining FTS lies with the database management team, and the Aspire project simply connects to that instance. The second path is suitable for a local development environment and involves running SQL Server under Aspire control as a container; in this case, it is necessary to use an image that includes the FTS package or customize the official image by adding the FTS package. In both paths, after starting the engine, the catalog and full-text index definition operations must be performed, and the configurations related to language/stoplist and index population mode must be determined.
Implementation steps and issues to consider after provisioning the engine include: ensuring a suitable unique index exists as a technical key for the full-text index (Full-Text Index requires a single-column unique key), selecting or defining a full-text catalog and, if necessary, a custom stoplist, defining a full-text index on text columns by specifying the appropriate language/LCID for each column so that the corresponding word breaker and stemmer are applied, planning and monitoring the initial indexing process and subsequent population policies (full or incremental) and determining index update methods in response to data changes, and finally validating the performance and accuracy of results through CONTAINS/FREETEXT-based queries and reviewing the query execution plan to ensure proper use of full-text indexes.
Important technical and operational considerations to consider before implementation include: SQL Server version and distribution compatibility with the FTS package in the Linux/container environment (some version/distribution combinations require additional packages or special settings), the requirement to design the database schema to have a proper single-column unique key, configuring port mapping and network and firewall rules for access by management tools (especially in containerized scenarios where orchestration systems may dynamically map ports), capacity and monitoring considerations as full-text indexing can generate significant I/O and CPU load, and examining backup behavior and index rebuild time after restore to reduce downtime risks. Also, if non-English language support is required, appropriate language settings and word breakers for those languages need to be implemented and tested to ensure that search results are accurate and acceptable.
Common troubleshooting points to note include: Failure to detect FTS in a SQL instance is often due to using a container image that does not have the FTS package, and the first step is to check the contents of the packages/features installed on the engine; Unique key errors when defining a full-text index usually indicate the absence of a single-column unique key or the presence of null values in the key column; Poor performance or incomplete search results can be caused by incorrect language/stoplist configuration or incomplete index population; and connectivity issues from outside the container usually stem from port mappings, network rules, or permissions, and the network and orchestration configuration should be checked.
Implementation recommendations and summary: For a local development environment, the fastest and least cumbersome approach is to use a container image that includes mssql-server-fts and configure Aspire to run that image; in this case, be sure to emphasize data persistence (volumes), fixed port mapping, and local backup policies so that the development team and management tools have reliable access. For production environments or enterprise-level team use, it is recommended to use managed instances or SQL Server virtual/physical machines on which FTS is installed and maintained.
I found the main problem.
I started the service from my Application subclass and got the exception after reboot.
I moved the code to a boot BroadcastReceiver and it works!
Note: there is no difference if the app is launched manually.
The bug has been reported to Google.
On Slackware 15.0, the "Failed to connect to the bus..." messages got fixed by
export DBUS_SESSION_BUS_ADDRESS=unix:path=/var/run/dbus/system_bus_socket
SEC-CH-UA: "Google Chrome";v="141", "Not?A_Brand";v="8", "Chromium";v="141"
SEC-CH-UA-MOBILE: ?1
Funny that the answer really is this stupidly easy.
After defining build.rs in our project directory:
use std::{env, path::PathBuf};

fn main() {
    // Use cortex-m-rt's linker script
    println!("cargo:rustc-link-arg=-Tlink.x");

    // Make the *containing directory* of memory.x visible to the linker
    let manifest_dir = PathBuf::from(env::var("CARGO_MANIFEST_DIR").unwrap());
    println!("cargo:rustc-link-search={}", manifest_dir.display());

    // Rebuild if memory.x changes
    println!(
        "cargo:rerun-if-changed={}",
        manifest_dir.join("memory.x").display()
    );
}
we got this error, showing the root of the problem, once the memory.x script was finally being used:
memory.x:5: redefinition of memory region 'FLASH'
>>> REGION_ALIAS(FLASH, FLASH); >>>
^ error: could not compile stm32_firmware (bin "stm32_firmware") due to 1 previous error
So I quite literally just needed to remove the REGION_ALIAS(FLASH, FLASH); line, and the binary is looking scrumptious now.
@thebusybee, thanks for letting me know verbose still exists, lol. Using the verbose flag I confirmed the linker flags are being appended to the build commands defined in the .config file, ruling that out and revealing that memory.x wasn't being used in the first place.
If your plan is to use maps in a Windows desktop app, just forget it!! Microsoft again closed all the doors to an easy implementation; it's incredible!!! Flutter is a good solution!
Spring Boot 3.5 introduced internal changes to its Binder.
You cannot modify queryParameters because it is an immutable list.
Try:
@Getter
@Setter
public class ApiConfig {
    private List<String> queryParameters = new ArrayList<>();
}
Of course it's possible with a formula.
The root cause of the issue was that the storage was full. I was able to see that clearly in the AWS RDS console.
Sharing here because there are very few occurrences of this error online, and none of them led me to the correct root cause in my case.
To prevent vertical scrolling, you should add "overflow-y: hidden;"
Try something like this:
.navbar {
overflow-x: auto;
overflow-y: hidden;
white-space: nowrap;
}
Still struggling with this issue.
cyrillic_font = TrueTypeFont("DejaVuSans.ttf") doesn't work.
When I try true_type_font_from_file instead, I get: AttributeError: type object 'TrueTypeFont' has no attribute 'true_type_font_from_file'.
Please help!
PowerShell treats $myargs as one single argument, while tasklist expects /FI and its filter expression to be separate arguments.
Try passing an array of parameters:
$myfilter = "foo"
$myargs = @("/FI", $myfilter)
tasklist @myargs
At The ABClinic, we combine modern science with a human-centered approach. Every therapy plan begins with a detailed evaluation to identify the client’s unique strengths and challenges. From there, our speech-language pathologists design a personalized program using evidence-based techniques proven to achieve real results.
We emphasize a family-centered model, encouraging parents and caregivers to take an active role in therapy. This collaboration ensures that progress made in the clinic continues at home, school, and in the community.
Our treatment philosophy centers on three key pillars:
Empathy: We listen and understand before we guide.enter image description here
Evidence: Every therapy plan is grounded in the latest research and proven clinical practices.
Empowerment: We build confidence through consistent progress and encouragement.
Over the years, The ABClinic has built a strong reputation across Oregon for professional excellence and heartfelt care. Our clients choose us not only for our clinical expertise but also for the warm, supportive environment we provide.
Here’s what sets us apart:
Experienced Speech-Language Pathologists: Each therapist brings specialized knowledge, advanced training, and years of clinical experience.
Customized Therapy Plans: Every individual receives a tailored program based on their specific communication goals.
Modern Tools and Technology: We use the latest assessment software, interactive digital therapy aids, and progress-tracking systems.
Family-Centered Approach: We educate and empower parents to participate actively in their child’s speech journey.
Inclusive and Accessible Services: We welcome clients from all backgrounds and provide flexible scheduling for busy families.
Whether your goal is to help your child say their first words, overcome stuttering, or restore speech after a stroke, The ABClinic provides the expertise and compassion you need to move forward.
Every day at The ABClinic, we witness remarkable transformations. A child who once struggled to express themselves can now tell stories with confidence. A professional who feared public speaking can now communicate clearly and effectively. A stroke survivor can once again speak with loved ones.
These stories remind us why we do what we do — because communication is not just about words; it’s about connection, understanding, and confidence.
Located in Sherwood, Oregon, The ABClinic proudly serves clients from surrounding communities, including Portland, Tigard, Tualatin, and Newberg. We are deeply committed to making speech and language therapy accessible to everyone who needs it, regardless of age or background.
Through school collaborations, public awareness programs, and family education, The ABClinic continues to raise awareness about the importance of early intervention and ongoing communication support.
As technology advances, The ABClinic remains at the forefront of innovation. We continuously integrate digital learning tools, teletherapy options, and modern assessment techniques to make therapy more engaging and effective. Our team’s dedication to professional growth ensures that every client receives the highest standard of care based on the latest clinical research.
We believe that communication is a lifelong journey — and we are honored to walk that path alongside our clients.
The ABClinic is more than a speech and language center — it’s a place of hope, progress, and empowerment. By combining science with compassion, we help individuals of all ages unlock their communication potential and achieve greater confidence in every aspect of life.
Whether you’re seeking pediatric speech therapy, adult language support, or autism communication programs, The ABClinic is here to help you speak, connect, and thrive.
Root Node
Definition: The topmost node of the tree.
Function: It represents the entire dataset and is the first point where the data is split based on the most significant feature.
Example: If you're predicting customer churn, the root node might split on "Contract Type" — the feature that best separates churn vs. non-churn.
Internal Node
Definition: Any node that is not the root or a leaf.
Function: It continues splitting the data based on other features, refining the decision path.
Example: After splitting on "Contract Type", an internal node might split on "Monthly Charges" or "Tenure".
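The root/internal/leaf structure can be illustrated with a tiny hand-built tree for the churn example (the feature names and the threshold below are invented for illustration, not learned from data):

```python
def predict_churn(customer):
    """Walk a hand-built decision tree.
    Root node: split on Contract Type.
    Internal node: split on Monthly Charges.
    Leaves: the final churn prediction."""
    if customer["contract"] == "month-to-month":      # root node split
        if customer["monthly_charges"] > 70:          # internal node split
            return "churn"                            # leaf
        return "no churn"                             # leaf
    return "no churn"                                 # leaf

print(predict_churn({"contract": "month-to-month", "monthly_charges": 85}))
print(predict_churn({"contract": "two-year", "monthly_charges": 85}))
```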
This solves the issue:
https://developer.apple.com/forums/thread/737894
Note that it involves 3 steps:
Create an archive (not a build)
Do the steps in the link
Manually notarize
All these steps are in the link above.
Interesting challenge. The following should do it. I don't use Trino myself, so it is not tested.
SELECT t.ID
FROM
(
    SELECT
        ID,
        reverse(split(URL, '/')) AS Elements
    FROM your_table -- placeholder: the table name was not given in the question
) AS t
WHERE cardinality(t.Elements) >= 2 AND t.Elements[1] = t.Elements[2]
Note that Trino string literals use single quotes, the boolean operator is AND (not &&), and arrays are 1-indexed.
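As a quick cross-check of the logic in plain Python (same idea: split on '/', compare the last two segments; the URLs are made up):

```python
def last_two_segments_equal(url):
    """True when the last two '/'-separated parts of the URL match,
    mirroring reverse(split(URL, '/')) with Elements[1] = Elements[2]."""
    parts = url.split("/")
    return len(parts) >= 2 and parts[-1] == parts[-2]

print(last_two_segments_equal("https://example.com/a/b/b"))  # matching last segments
print(last_two_segments_equal("https://example.com/a/b/c"))  # differing last segments
```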
cd /var/log
echo "start now" > youraccount.pythonanywhere.com.access.log
echo "start now" > youraccount.pythonanywhere.com.error.log
echo "start now" > youraccount.pythonanywhere.com.server.log
You can put all of that in a file. Copy those 4 lines, then:
cd $HOME
nano clearlogs.sh
paste
ctrl-o (save)
ctrl-x (exit)
chmod +x clearlogs.sh
To run it:
./clearlogs.sh
random_num = np.random.normal(size=4)
coff = np.round(random_num, decimals=2)
This snippet generates 4 random numbers and rounds them to 2 decimal places.
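If NumPy isn't available, the same thing can be done with only the standard library (random.gauss draws from the same standard normal distribution):

```python
import random

# Four standard-normal draws, rounded to 2 decimal places
coff = [round(random.gauss(0, 1), 2) for _ in range(4)]
print(coff)
```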
You might want to check with your proxy provider and ask for detailed usage logs — they usually record the client IPs.
For example, KindProxy logs connection IPs, which can help you verify that only your own servers are being used.
If you see IPs that aren’t from your AWS EC2 instances, it may indicate that your account credentials have been compromised.
Set the Python path inside Apache’s WSGI configuration, e.g.:
WSGIPythonPath /Users/<MY_NAME_>/PATH_TO/py3
or in your <VirtualHost>:
WSGIApplicationGroup %{GLOBAL}
WSGIDaemonProcess myapp python-path=/Users/<MY_NAME_>/PATH_TO/py3
WSGIProcessGroup myapp
This is not valid C code. In C you pass parameters by value, not by reference.
The equivalent of this is:
void row_del(int ***a, int *row, int col, int k);
Introduced in PowerShell 7.4:
Start-Process -Environment @{ foo = 'bar' } app
You may check it; it works for me.
import pandas as pd

# File name
file_name = "20251026142556231_入出庫明細.csv"

# Load the data (parse the date column as datetime)
df = pd.read_csv(file_name, parse_dates=['日付'])

# Target period
start_date = pd.to_datetime('2025-10-01')
end_date = pd.to_datetime('2025-10-25')

# Filtering
# 1. In/out type is '出庫' (outbound)
df_out = df[df['入出庫区分'] == '出庫'].copy()

# 2. Date between 10/01 and 10/25
df_filtered = df_out[(df_out['日付'] >= start_date) & (df_out['日付'] <= end_date)].copy()

# Treat missing values in the '出庫数' (outbound quantity) column as 0
df_filtered['出庫数'] = df_filtered['出庫数'].fillna(0)

# Aggregate: total outbound quantity per product code, product name, and location
df_grouped = df_filtered.groupby(['商品コード', '商品名', 'ロケーション'], dropna=False)['出庫数'].sum().reset_index()

# Rename the column
df_grouped.rename(columns={'出庫数': '合計出荷数'}, inplace=True)

# Sort by shipped quantity, descending (all rows)
df_sorted = df_grouped.sort_values(by='合計出荷数', ascending=False)

# Select the needed columns
df_result = df_sorted[['商品コード', '商品名', 'ロケーション', '合計出荷数']]

# Write the result to an Excel file
output_file = "出荷量順_全商品集計_1001_1025.xlsx"
df_result.to_excel(output_file, index=False, sheet_name='全商品')

print(f"Processing finished. The result was saved to '{output_file}'.")
Here’s a working example that uses authentication directly in the proxy URL:
import requests
import json
username = "account"
password = "password"
proxies = {
"http": f"http://{username}:{password}@gw.kindproxy.com:12000",
"https": f"http://{username}:{password}@gw.kindproxy.com:12000",
}
def fetch(url):
    r = requests.get(url, proxies=proxies, timeout=30)
    try:
        data = r.json()
        print(json.dumps(data, indent=4))
    except Exception:
        print(r.text)

fetch("http://ipinfo.io")
This format works for both HTTP and SOCKS5 proxies (just change the protocol if needed).
It’s simple and doesn’t require extra authentication objects.
For more practical examples — including Python, Node.js, and curl — see:
kindproxy dot com / faq / code-examples
Are you using any automation to start the tab recording? The streamId is only generated when there is a user interaction.
TL;DR: https://github.com/nodejs/node/
I would say there are 3 main components in the concept:
V8 Engine
libuv
Node.js core (I'd call it the orchestrator)
When you run a command like node index.js, what happens?
First, how does the node command work? The node executable on your PATH provides the CLI; running it starts a new Node.js process.
Node runs its StartExecution entry point and does the following:
Node starts the V8 sandbox with a just-in-time (JIT) compiler, in charge of compiling JS code to machine code according to the ECMAScript and WebAssembly standards.
Node sets up internal bindings, wrapping calls to external C libraries (file system, streams, ...) as Node APIs like node:fs, node:stream, ...
Node starts the libuv event loop (THIS IS WHAT YOU USUALLY HEAR ABOUT) to manage asynchronous tasks. We need to pay attention to:
Worker Thread Pool
Callback Queue
Okay, startup is finished. Now the entry file is index.js, and node says: "Hey V8, please run this JS code".
V8 JIT-compiles and runs the code line by line. Plain native JS like const a = 1 + 2 is executed directly, and a function call like sum() is put onto the call stack to execute.
For async tasks (microtasks, macrotasks) like Promises, timers, ... now comes the heart of Node.js, where the game starts.
When you try to run something like: fs.readFile('./file.pdf', 'utf-8', (content) => {...});
Remember when I said Node does internal binding: node delegates the call to the external C library dependencies, AND it is managed by libuv's async task manager.
At the Node.js API level, it validates parameters (path, encoding, ... data types), and then calls into the node core binding, which registers two things in the QUEUE:
the task
the callback
A worker thread from libuv picks up the task and executes the call using the C library, locking the thread resource if it is an I/O task; if it is an OS task, it hands it over to the OS's async facility.
When the task finishes, the result is put into the CALLBACK QUEUE.
The libuv event loop sees that the result is ready and triggers the callback; this callback is actually a C pointer referencing the callback we registered in the JS code.
At the end, the JS code has nothing left to execute, the UV loop is empty, and no tasks are left, so Node.js starts ending the running process and emits a signal to the system to stop it.
WSL doesn't expose /dev/dri; it only exposes /dev/dxg, which is D3D12. Traditional Vulkan drivers look for GPU hardware via /dev/dri, and that is the reason it falls back to llvmpipe. You can try building Mesa's d3d12 driver from source, which may help you.
Perfect, it's working; thank you for the hint!
Add import "dotenv/config"; to prisma.config.ts. I hope this helps :)
Action: file_editor view /app/backend/server.py
Observation: /app/backend/server.py:
1|from fastapi import FastAPI, APIRouter
2|from dotenv import load_dotenv
3|from starlette.middleware.cors import CORSMiddleware
4|from motor.motor_asyncio import AsyncIOMotorClient
5|import os
6|import logging
7|from pathlib import Path
8|from pydantic import BaseModel, Field, ConfigDict
9|from typing import List
10|import uuid
11|from datetime import datetime, timezone
12|
13|
14|ROOT_DIR = Path(_file_).parent
15|load_dotenv(ROOT_DIR / '.env')
16|
17|# MongoDB connection
18|mongo_url = os.environ['MONGO_URL']
19|client = AsyncIOMotorClient(mongo_url)
20|db = client[os.environ['DB_NAME']]
21|
22|# Create the main app without a prefix
23|app = FastAPI()
24|
25|# Create a router with the /api prefix
26|api_router = APIRouter(prefix="/api")
27|
28|
29|# Define Models
30|class StatusCheck(BaseModel):
31| model_config = ConfigDict(extra="ignore") # Ignore MongoDB's _id field
32|
33| id: str = Field(default_factory=lambda: str(uuid.uuid4()))
34| client_name: str
35| timestamp: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
36|
37|class StatusCheckCreate(BaseModel):
38| client_name: str
39|
40|# Add your routes to the router instead of directly to app
41|@api_router.get("/")
42|async def root():
43| return {"message": "Hello World"}
44|
45|@api_router.post("/status", response_model=StatusCheck)
46|async def create_status_check(input: StatusCheckCreate):
47| status_dict = input.model_dump()
48| status_obj = StatusCheck(**status_dict)
49|
50| # Convert to dict and serialize datetime to ISO string for MongoDB
51| doc = status_obj.model_dump()
52| doc['timestamp'] = doc['timestamp'].isoformat()
53|
54| _ = await db.status_checks.insert_one(doc)
55| return status_obj
56|
57|@api_router.get("/status", response_model=List[StatusCheck])
58|async def get_status_checks():
59| # Exclude MongoDB's _id field from the query results
60| status_checks = await db.status_checks.find({}, {"_id": 0}).to_list(1000)
61|
62| # Convert ISO string timestamps back to datetime objects
63| for check in status_checks:
64| if isinstance(check['timestamp'], str):
65| check['timestamp'] = datetime.fromisoformat(check['timestamp'])
66|
67| return status_checks
68|
69|# Include the router in the main app
70|app.include_router(api_router)
71|
72|app.add_middleware(
73| CORSMiddleware,
74| allow_credentials=True,
75| allow_origins=os.environ.get('CORS_ORIGINS', '*').split(','),
76| allow_methods=["*"],
77| allow_headers=["*"],
78|)
79|
80|# Configure logging
81|logging.basicConfig(
82| level=logging.INFO,
83| format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
84|)
85|logger = logging.getLogger(__name__)
86|
87|@app.on_event("shutdown")
88|async def shutdown_db_client():
89| client.close()
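The serialize/parse pattern used in `create_status_check` and `get_status_checks` (datetime to ISO string before insert, string back to datetime after reading) can be checked in isolation; a minimal standalone sketch of that round trip:

```python
from datetime import datetime, timezone

# Serialize an aware UTC datetime to an ISO 8601 string, as done before insert_one()
created = datetime.now(timezone.utc)
stored = created.isoformat()          # e.g. '2024-01-01T12:00:00.123456+00:00'

# Parse it back into a datetime, as done when reading documents in get_status_checks()
restored = datetime.fromisoformat(stored)

assert restored == created            # the value survives the round trip
assert restored.tzinfo is not None    # timezone info is preserved
```

Storing ISO strings keeps the documents portable, and `fromisoformat` restores full timezone-aware datetimes losslessly.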
I'm having the same exact issue. Any solution?
With Swift 6 this seems to be problematic even as Codable, even with simple structs, and it produces a warning unless the property is optional. Is there a solution to this?
Check your camera permissions for the file you are trying to use the camera with. I tested it myself, and it worked fine.
Using AirDrop installs it right away!
I got a similar error in Zend Framework, and I noticed that it works if I change the module name to something other than 'public'. It seems 'public' is a reserved word in PHP and can't be used for namespaces. Hope this helps.
You can try
nodemon ./index.js
Or
nodemon index
For more specific information, you can see this documentation
Finally... Hi, how did you finally fix the error with the ibm_db package? I still have the problem and haven't been able to solve it. What were the exact steps that fixed your problem with ibm_db?
Use react-native-maps 1.20.1 or older to keep old-architecture (oldArch) support. That will fix your issue.
from reportlab.lib.pagesizes import landscape, A3
from reportlab.pdfgen import canvas
from reportlab.lib.units import mm
# Path where the PDF will be saved
pdf_path = "Tavole_4e5_Tipografia.pdf"
c = canvas.Canvas(pdf_path, pagesize=landscape(A3))
# Helper to draw the page margins
def draw_margins(x0, y0, width, height, margin):
    c.setStrokeColorRGB(0.7, 0.7, 0.7)
    c.rect(x0 + margin, y0 + margin, width - 2*margin, height - 2*margin)
# General parameters
page_w, page_h = landscape(A3)
margin = 20 * mm
# ------------------ TAVOLA 4 ------------------
c.setFont("Helvetica-Bold", 14)
c.drawString(40, page_h - 40, "TAVOLA 4 — GIUSTEZZA (COMPOSIZIONE GIUSTIFICATA)")
draw_margins(0, 0, page_w, page_h, margin)
# Columns
col_width = (page_w - 2*margin - 20*mm) / 2
y_top = page_h - 80*mm
text_height = 60*mm
# Left box
c.setStrokeColorRGB(0.5, 0.5, 0.5)
c.rect(margin, y_top, col_width, text_height)
c.setFont("Helvetica-Bold", 10)
c.drawString(margin, y_top + text_height + 10, "GIUSTIFICATO CON SILLABAZIONE")
c.setFont("Helvetica", 9)
c.drawString(margin, y_top - 15, "Con la sillabazione attiva, il margine destro è regolare e la lettura risulta fluida.")
# Right box
x_right = margin + col_width + 20*mm
c.rect(x_right, y_top, col_width, text_height)
c.setFont("Helvetica-Bold", 10)
c.drawString(x_right, y_top + text_height + 10, "GIUSTIFICATO SENZA SILLABAZIONE")
c.setFont("Helvetica", 9)
c.drawString(x_right, y_top - 15, "Senza sillabazione, la spaziatura irregolare crea 'fiumi bianchi' e affatica la lettura.")
# Title block (cartiglio)
cart_h = 20*mm
c.setStrokeColorRGB(0.6, 0.6, 0.6)
c.rect(page_w - 100*mm, margin, 90*mm, cart_h)
c.setFont("Helvetica", 8)
c.drawString(page_w - 98*mm, margin + cart_h - 10, "TITOLO: GIUSTEZZA (COMPOSIZIONE GIUSTIFICATA)")
c.drawString(page_w - 98*mm, margin + cart_h - 20, "NOME: __________________ MATERIA: COMPETENZE GRAFICHE DATA: __/__/__")
c.showPage()
# ------------------ TAVOLA 5 ------------------
c.setFont("Helvetica-Bold", 14)
c.drawString(40, page_h - 40, "TAVOLA 5 — ALLINEAMENTO E SILLABAZIONE")
draw_margins(0, 0, page_w, page_h, margin)
# Columns
col_w = (page_w - 2*margin - 3*15*mm) / 4
y_top = page_h - 80*mm
text_h = 60*mm
alignments = [
"A SINISTRA (BANDIERA DESTRA)",
"A DESTRA (BANDIERA SINISTRA)",
"CENTRATO",
"GIUSTIFICATO (SENZA SILLABAZIONE)"
]
comments = [
"Ideale per testi lunghi come romanzi o articoli.",
"Usato per brevi testi o note a margine.",
"Adatto a titoli o testi brevi.",
"Uniforma i margini ma può generare spazi irregolari."
]
x = margin
for i in range(4):
    c.setStrokeColorRGB(0.5, 0.5, 0.5)
    c.rect(x, y_top, col_w, text_h)
    c.setFont("Helvetica-Bold", 9)
    c.drawString(x, y_top + text_h + 10, alignments[i])
    c.setFont("Helvetica", 8.5)
    c.drawString(x, y_top - 15, comments[i])
    x += col_w + 15*mm
# Title block (cartiglio)
c.setStrokeColorRGB(0.6, 0.6, 0.6)
c.rect(page_w - 100*mm, margin, 90*mm, cart_h)
c.setFont("Helvetica", 8)
c.drawString(page_w - 98*mm, margin + cart_h - 10, "TITOLO: ALLINEAMENTO E SILLABAZIONE")
c.drawString(page_w - 98*mm, margin + cart_h - 20, "NOME: ____________________ MATERIA: COMPETENZE GRAFICHE DATA: __/__/____")
c.save()
print(f"PDF generato: {pdf_path}")
Solution is to set the language standard to C++ 20 as described here:
https://www.learncpp.com/cpp-tutorial/configuring-your-compiler-choosing-a-language-standard/
Then go to Project -> Export template. Then use the new template next time.
So, open the ToolBox and right-click on any empty area (for instance, on "General"), choose "Choose Items…", then after loading press Browse and navigate to the "packages" folder inside your solution folder. Here is an example of the path:
"...Application\packages\Microsoft.ReportingServices.ReportViewerControl.Winforms.150.1652.0\lib\net40". Once there, find the file named "Microsoft.ReportViewer.WinForms.dll" and choose it. So, here we are! All that's left is to type "ReportViewer" in the toolbox search bar and drag it :)
P.S. This solution assumes the package is already installed.
Go to the Console tab
Look for errors mentioning CSS or 404s
See if CSS files are loading (look for red/failed requests)
This will show you the exact path Elgg is trying to load CSS from
I have a faster floating-point number printing algorithm.
https://onlinegdb.com/OPKdOpikG
You can customize the default model-binding error message globally:
builder.Services.AddControllersWithViews()
.AddMvcOptions(options =>
{
options.ModelBindingMessageProvider.SetValueMustBeANumberAccessor((value, fieldName) =>
$"Please enter a valid number for {fieldName}.");
});
I recently encountered this problem, but I have neither Gauge nor AWS Toolkit installed. The "Close block comment" option under Editor > General > Smart Keys is already disabled, but the problem still exists.
Apparently there was an extra slash, as suggested by Drew Reese in the comments. I also needed to move ThemeProvider out of the scope of BrowserRouter; I think BrowserRouter needs to reside just above the app, like this -
import { createRoot } from "react-dom/client"
import "./index.css"
import App from "./App"
import { Toaster } from "sonner"
import { ThemeProvider } from "./components/theme-provider"
import { BrowserRouter } from "react-router-dom"
createRoot(document.getElementById("root")!).render(
<ThemeProvider attribute="class" defaultTheme="system" enableSystem>
<BrowserRouter basename="/unidash">
<App />
</BrowserRouter>
<Toaster richColors position="top-center" />
</ThemeProvider>
)
I ended up using RevenueCat and it works great.
1 - The space you see printed in your console appears because you passed multiple values to console.log(); separating arguments with a single space is default JS behaviour, intended to keep the output readable. Elsewhere you might be passing one concatenated value instead of separate variables.
2 - console.log(foo, bar) will not do anything extra, for the same reason as in point 1: the two arguments are simply printed with the default single-space separator.
3 - console.log(foo, " ", bar) creates extra whitespace because the " " between the commas is itself a string value, so you are effectively printing three values; the output becomes "Foo   Bar". :D
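For what it's worth, Python's print() behaves the same way as console.log() here, which makes the effect easy to see; a small sketch:

```python
foo, bar = "Foo", "Bar"

# print() joins its arguments with a single space by default (like console.log)
print(foo, bar)        # Foo Bar

# Passing " " as an extra argument yields three spaces: sep + " " + sep
print(foo, " ", bar)   # Foo   Bar

# In Python the separator is explicit and controllable via join (or sep=)
joined = " ".join([foo, bar])
assert joined == "Foo Bar"
```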
Hello, I have two questions for you.
I'm using version v1.6 of your code, so I should use:
CALL :RequestAdminElevation "%~dpfs0" %* || goto:eof
or
CALL :RequestAdminElevation "%~dpf0" %* || goto:eof
Which is better? "%~dpfs0" or "%~dpf0" ???
The second question is (see the attached screenshot).
When I run a batch code file converted from .cmd to .exe, for example, TEST.exe, with elevated administrator privileges on a shared folder on a Windows 10 x64 system in VirtualBox, I get a path error.
When I run the same file without administrator privileges, it runs correctly, but with elevated administrator privileges, it doesn't and displays a path error.
I need to copy the file from the shared folder to the desktop in VirtualBox and run it from there – then it works properly with elevated administrator privileges.
Is there any way to get the permissions and the current path to also work when I run TEST.exe directly from the shared folder in VirtualBox on my PC?
When I run TEST.cmd itself before converting it to .exe format, I can run it directly from the shared folder in VirtualBox without copying it to the desktop, and it also runs with elevated administrator privileges.
The problem only occurs when I run it in .exe format and directly from the shared folder in VirtualBox without moving the file.
I managed to figure it out with some help from the GNOME Discourse page- see this post for details.
package com.example.ghostapp

import androidx.appcompat.app.AppCompatActivity
import android.os.Bundle

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
    }
}
There are a lot of working solutions here, but here is the fastest one in terms of performance:
function table.append(t1, t2)
    local n = #t1
    for i = 1, #t2 do
        t1[n + i] = t2[i]
    end
    return t1
end
It concatenates table t2 in place onto the end of table t1.
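For comparison (not part of the Lua answer), the same in-place append in Python is list.extend:

```python
t1 = [1, 2, 3]
t2 = [4, 5]

# Appends every element of t2 to the end of t1, mutating t1 in place
t1.extend(t2)

assert t1 == [1, 2, 3, 4, 5]
```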
You can skip requirements.txt entirely and just use uv:
uv init
uv pip install pandas numpy scikit-learn
uv sync
uv automatically builds and manages the environment for you.
The problem was that ids repeated across several batches, which resulted in overwriting part of the data that had previously been loaded.
It does look like a change in behaviour - but since the behaviour is undefined to start with, it should not be relied on. If anyone has been relying on this and it has worked so far, hopefully this question serves as a warning.
You can find the check_elf_alignment.sh script here — special thanks to @NitinPrakash9911 for sharing it.
Place it in the root of your project, then make it executable:
chmod +x check_elf_alignment.sh
./check_elf_alignment.sh app/build/outputs/apk/release/app-release.apk
Note: pass your APK path after ./check_elf_alignment.sh
If everything is configured correctly, the script's output will show the alignment status of each native library in the APK.
I had this problem with the Dio package.
Flutter SDK [3.24.0-3.32.0]
Dio [5.5.0 - 5.7.0]
option + shift + f or alt + shift + f
This is the way to format in VS Code.
I used this AI-assisted solution:
Your Laragon setup with PHP 8.3 has an SSL certificate verification problem when using Composer - it is not caused by AVG directly (although AVG can aggravate it).
Here is exactly how to repair it step by step in Laragon (Windows), tested for this type of error:
curl error 60 while downloading ... SSL certificate problem: unable to get local issuer certificate
Composer cannot verify the HTTPS certificates (Packagist, GitHub, etc.).
We need to make PHP aware of the correct root certificates (cacert.pem) and make Composer use them.
Go to the official cURL site:
👉 https://curl.se/ca/cacert.pem
Save the file as:
C:\laragon\bin\php\php-8.3.26-Win32-vs16-x64\extras\ssl\cacert.pem
(If the extras\ssl folders do not exist, create them manually.)
Open this file with a text editor:
C:\laragon\bin\php\php-8.3.26-Win32-vs16-x64\php.ini
Search for these lines (use Ctrl + F):
;curl.cainfo
;openssl.cafile
Change them (remove the ; and set the full path to your cacert.pem):
curl.cainfo = "C:\laragon\bin\php\php-8.3.26-Win32-vs16-x64\extras\ssl\cacert.pem"
openssl.cafile = "C:\laragon\bin\php\php-8.3.26-Win32-vs16-x64\extras\ssl\cacert.pem"
Save the changes.
Run in a Laragon terminal:
composer config -g cafile "C:\laragon\bin\php\php-8.3.26-Win32-vs16-x64\extras\ssl\cacert.pem"
👉 This ensures Composer uses exactly that same certificate.
Close Laragon completely.
Open it again → click "Start All".
In the terminal, run:
composer diagnose
👉 The "Checking https connectivity" lines should now show:
Checking https connectivity to packagist: OK
Checking github.com rate limit: OK
I hit this problem when I started running my application through Docker. Adding this line to docker-compose.yaml helped: network_mode: "host"
Same error after a brew upgrade php; I had to reinstall the intl extension:
brew install php-intl
> Use `s!help <command>` to get more information
> `s!help`
> `s!info`
> `s!list`
> `s!ping`
> `s!avatarinfo`
> `s!bannerinfo`
> `s!guildbannerinfo`
> `s!guildiconinfo`
> `s!guildmembercount`
> `s!guildsplashinfo`
> `s!searchdocumentation`
> `s!stickerpackinfo`
> `s!report`
> `s!warns`
> `s!puzzle`
6. Enter the following transactions in the three-column cash book of Sunil and balance the same as on 31.12.2013.
2013
Jan. 1  Cash in hand 5,400; Cash at bank 1,475
Jan. 2  Issued cheque to Sekhar 850; discount received 150
Jan. 3  Paid salaries 1,150
Jan. 5  Cash received from sale of investments ₹4,900, out of which ₹1,250 was deposited into bank
Jan. 6  Received from Vikram a cheque of 775 in settlement of his account for 950
Jan. 9  Received from Naidu ₹1,150; discount allowed 50
Jan. 10 Withdrew for personal use by cheque 175
Jan. 11 Bank charges as per pass book 10
Jan. 14 Interest received from Manohar 140
Jan. 16 Goods sold for cash 7,000
Jan. 18 Bank collected dividends on shares 360
        Purchased from Wahed for cash 2,400
Jan. 20 Paid rent 400
(B.Com., Kakatiya) (Ans. Closing Cash in hand 13,390; Cash at Bank 2,825)
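As a quick check of the stated answer, the closing balances can be recomputed (assuming Vikram's cheque is banked on receipt and the Naidu, Manohar, and sales receipts are in cash):

```python
# Recompute the closing balances of Sunil's three-column cash book.
cash = 5400              # Jan. 1: cash in hand
bank = 1475              # Jan. 1: cash at bank

bank -= 850              # Jan. 2: cheque issued to Sekhar
cash -= 1150             # Jan. 3: salaries paid in cash
cash += 4900 - 1250      # Jan. 5: investments sold; 1,250 of it banked
bank += 1250
bank += 775              # Jan. 6: Vikram's cheque, assumed banked on receipt
cash += 1150             # Jan. 9: received from Naidu in cash
bank -= 175              # Jan. 10: drawings by cheque
bank -= 10               # Jan. 11: bank charges as per pass book
cash += 140              # Jan. 14: interest received from Manohar
cash += 7000             # Jan. 16: goods sold for cash
bank += 360              # Jan. 18: dividends collected by the bank
cash -= 2400             #          purchase from Wahed for cash
cash -= 400              # Jan. 20: rent paid in cash

print(cash, bank)        # 13390 2825 — matches the stated answer
```

(Discounts received/allowed affect only the memorandum discount columns, not the cash or bank balances.)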
Did you try changing the network settings on your device? Perhaps the problem is there. I'm not an expert, but maybe you should change your IP protocol to something that works with mobile (G) internet in China.
There are 2 separate issues (API key auth and Gemini model expiration) that may impact each other. I would focus on fixing the model expiration issue first.
As of 9/24/2025, Gemini 1.5 (Pro and Flash) was retired (see the retired models). Continuing to use Gemini 1.5 will cause a 404 NOT FOUND error (in either Vertex AI or AI Studio). You should migrate to Gemini 2.0/2.5 Flash or later instead. Please follow the latest Gemini migration guide.
Fixing the model expiration issue should give you a clearer picture of whether the API key is still causing problems.
As of Python 3.13 (and further improved in 3.14), the Python REPL supports colour; see here.
Besides the REPL you asked about, 3.14 also brought colour to the output of a few standard-library CLIs and to PDB's output; see What's New in Python 3.14.
That's a great topic to explore. Spring Data JPA uses dynamic proxies under the hood to create those repository instances at runtime.