First, make sure you have configured your application to load the config correctly:
Example from Serilog:
static void Main(string[] args)
{
    var configuration = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json")
        .AddJsonFile($"appsettings.{Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") ?? "Production"}.json", true)
        .Build();

    var logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .CreateLogger();

    logger.Information("Hello, world!");
}
If you have done this, you also need to structure your JSON correctly, like this: https://github.com/serilog/serilog-sinks-rollingfile?tab=readme-ov-file#controlling-event-formatting
{
  "Serilog": {
    "WriteTo": [
      { "Name": "RollingFile", "Args": { "pathFormat": "log-{Date}.txt" } }
    ]
  }
}
@ch4mp thanks for the answer and the article. While the gateway approach is a bit of overkill for my scenario (I have a single frontend application), it pushed me in the right direction. I now have a proxy controller that intercepts each browser call, gets the JWT token from the session, and forwards the call with the JWT token to stateless REST endpoints.
Can you try using Navigator.push instead of Navigator.pop to see if the issue persists? Also, if you are using any packages related to navigation or state management, please let me know. I can connect with you and help resolve the issue.
I repeated the steps with IntelliJ (the same as with any other JetBrains product, such as Rider). It works perfectly. For context, the built-in SQLite driver in JetBrains IDEs does not support SQLCipher encryption. The reason you see that error is simply that the default SQLite driver just sees your encrypted database as a bunch of random bytes with no meaning.
First, download the sqlite-jdbc-crypt driver (I downloaded sqlite-jdbc-3.50.1.0.jar).
Second, define a custom driver:
Third, add the driver file that you downloaded in the first step.
It must look like this:
Then, use that newly added driver as your "Data Source".
Finally, add a URL like this, containing the path to your encrypted SQLite db file, your key, kdf_iter, etc.: jdbc:sqlite:file:/home/USER/test_database_v4.db?cipher=sqlcipher&legacy=4&kdf_iter=256000&cipher_page_size=4096&key=mySecretPassword123
That's it!
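If you want to sanity-check the same URL outside the IDE, a minimal JDBC sketch (assuming the sqlite-jdbc-crypt jar is on the classpath; path and key are placeholders) could look like this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqlCipherCheck {
    public static void main(String[] args) throws Exception {
        // Same style of URL as above; adjust path, key and cipher parameters to your database
        String url = "jdbc:sqlite:file:/home/USER/test_database_v4.db"
                + "?cipher=sqlcipher&legacy=4&kdf_iter=256000&cipher_page_size=4096&key=mySecretPassword123";
        try (Connection conn = DriverManager.getConnection(url);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT count(*) FROM sqlite_master")) {
            if (rs.next()) {
                // If the key and parameters are correct, this prints the number of schema objects
                System.out.println("Schema objects: " + rs.getInt(1));
            }
        }
    }
}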
In the comments section, Nate Eldredge left the following answer regarding .wrapping_shl():
They're identical to the point that the compiler just emits the code once and defines the other versions as aliases: godbolt.org/z/jGW3b6K1c
The confusion you are feeling when starting an Android-based XR (VR/AR) project with Unreal Engine (UE) is completely normal. Unity has a relatively clear path with AR Foundation and the XR Interaction Toolkit, whereas Unreal's documentation has become fragmented over the last year or two as it made a major transition to OpenXR.
To give the conclusion first: **"Unreal officially supports Android XR, but 'pure OpenXR' alone is not enough; it is most stable when used together with the hardware vendor's plugins (Meta, Pico, etc.)."**
Based on your questions, here is a summary of the current state and setup from a practical point of view.
This is the standard setup based on Unreal Engine 5.3~5.4, currently the most stable versions. If the versions do not match, packaging errors are very likely.
Engine Version: UE 5.4 (recommended) or 5.3. XR rendering performance (Vulkan) was greatly improved in 5.4.
Android Studio: Flamingo or Giraffe. Depends on the UE version (Flamingo/Giraffe recommended for UE 5.4).
NDK: r26b (UE 5.4) / r25b (UE 5.3). The exact path must be set in Project Settings.
JDK: OpenJDK 17. Setting the JAVA_HOME environment variable is required.
Build System: Gradle. You will be using AGP (Android Gradle Plugin) 8.x.
Min SDK: 29 (Android 10) or higher. XR devices (Quest 3, etc.) usually run a recent OS, so set 29~32.
Target SDK: 32 or 34. Check the latest requirements when publishing to the Google Play Store.
[Required plugins to enable]
OpenXR: Enabled (required, core engine functionality)
OpenXRHandTracking: Enabled (if hand tracking is needed)
Mobile Foveated Rendering: Enabled (essential for performance optimization)
The answer to your question, **"Does it work with OpenXR alone, without vendor integration?"**, is **"It works, but vendor plugins are essential for production-level quality."**
Pure OpenXR (native):
With only Unreal's OpenXR plugin enabled, you can run the app on Meta Quest or Pico and get head tracking and basic controller input.
Problem: vendor-specific features (e.g. Meta's Passthrough and Scene Understanding, Pico's specific controller models, refresh-rate control, etc.) are either not yet part of the standard OpenXR API or only exist as extensions.
Realistic workflow (hybrid):
Base: enable the OpenXR plugin (handles the standard API).
Extension: additionally enable the plugin for your target hardware.
Meta Quest: Meta XR plugin (built on OpenXR, provides essential features)
Pico: Pico OpenXR plugin
Android (handheld AR): Google ARCore plugin
This approach works much like Unity's XR Plug-in Management system.
This is the most confusing point in the documentation: the term 'Android XR' is used for two different things.
Handheld AR (smartphones/tablets):
Technology: uses Google ARCore.
Setup: enable the Google ARCore plugin and run Configure Google ARCore in Project Settings.
Status: compared to Unity's AR Foundation, Unreal's AR support receives feature updates more slowly. Simple AR is fine, but complex interactions may require C++ work.
HMD VR/MR (Android-based devices such as Quest and Pico):
Technology: uses OpenXR.
Setup: use the OpenXR + vendor plugin combination. ARCore is not used (Passthrough is handled by the vendor SDK).
Status: now that Unreal 5's Nanite and Lumen are starting to be supported (with limitations) on mobile (Android) XR, the potential for graphics quality is greater than Unity's.
These are the practical hurdles you will face when developing Android XR with Unreal compared to Unity.
Difficulty of the initial setup (Android setup):
Unreal provides the SetupAndroid.bat script, but the build fails if the Java or NDK version is even slightly off. It is not managed automatically the way Unity Hub does it.
Solution: before starting the project, use the "Android Turnkey" setup to confirm that all SDK paths show green (Valid).
Performance and build size:
Even an empty project produces a larger APK than Unity (roughly 100 MB and up by default).
Unreal's rendering pipeline is heavy on mobile GPUs. You must enable Forward Shading and configure Instanced Stereo Rendering or Mobile Multi-View to hold the frame rate.
Lack of official documentation:
Unreal's official documentation often does not reflect the latest changes promptly.
Tip: rather than the Epic Games docs, it is far more accurate to rely mainly on Meta's Unreal developer documentation or Pico's developer documentation.
Google's new "Android XR":
Source: Google
Using Ruby on Rails
You have #{n} #{'kid'.pluralize(n)}
See pluralize doc for options & alternatives:
https://api.rubyonrails.org/classes/ActionView/Helpers/TextHelper.html#method-i-pluralize
Fortnite restricts mouse movements from third-party programs. Your choices are to write a kernel driver that attaches to Fortnite and hope you don't get banned, or play it safe using an Arduino.
// Source - https://stackoverflow.com/a/63920302
// Posted by matdev, modified by community. See post 'Timeline' for change history
// Retrieved 2025-11-25, License - CC BY-SA 4.0
buildTypes {
    debug { ... }
    release { ... }
}

// Specifies one flavor dimension.
flavorDimensions "main"

productFlavors {
    demo {
        // Assigns this product flavor to the "main" flavor dimension.
        // If you are using only one dimension, this property is optional,
        // and the plugin automatically assigns all the module's flavors to
        // that dimension.
        dimension "main"
        applicationId "com.appdemo"
        versionNameSuffix "-demo"
    }
    full {
        dimension "main"
        applicationId "com.appfull"
        versionNameSuffix "-full"
    }
}
Date: 13/11/2025
To:
Finance Department
HungerStation Delivery Company
Subject: Notification of Change in Bank Account IBAN Details
Dear HungerStation Delivery Team,
We would like to inform you that the bank account details of Malbriz Arabia Company have been updated. Kindly take note of the new IBAN information provided below and ensure that all future payments, transfers, or transactions are made to the updated account.
Previous Bank Details:
· Bank Name: Saudi National Bank
· Account Name: مطعم الأرز المفضل لتقديم الوجبات
· Old IBAN: SA8110000001400023615710
New Bank Details:
· Bank Name: Saudi National Bank
· Account Name: Malbriz Arabia Co
· New IBAN: SA8110000001400023615710
· SWIFT/BIC (if applicable): NCBKSAJE
Please update your records accordingly to avoid any interruption in payments. The old IBAN will no longer be in use after 01/08/2025.
We request you to kindly confirm the update of our banking details in your records.
Thank you for your continued support and cooperation.
Yours sincerely,
Muhammed Shahin
General Manager
Malbriz Arabia Company
Date: 13/11/2025
To:
Finance Department
HungerStation Delivery Company
Subject: Notification of Change in Bank Account IBAN Details
Dear HungerStation team,
We would like to inform you that the bank account details of Malbriz Arabia Company have been updated. Accordingly, please take note of the new IBAN information shown below and ensure that all future payments, transfers, or financial transactions are made to the updated account.
Previous bank details:
Bank name: Saudi National Bank
Account name: مطعم الأرز المفضل لتقديم الوجبات
Old IBAN: SA8110000001400023615710
New bank details:
Bank name: Saudi National Bank
Account name: Malbriz Arabia Company
New IBAN: SA8110000001400023615710
SWIFT/BIC (if applicable): NCBKSAJE
Please update your records to avoid any interruption in payments. Note that the old IBAN will no longer be in use after 01/08/2025.
We kindly ask you to confirm that our banking details have been updated in your records.
Thank you for your continued cooperation and support.
Yours sincerely,
Muhammed Shahin
General Manager
Malbriz Arabia Company
xdebug.discover_client_host = true
or
xdebug.client_host = "127.0.0.1"
is the key point for newer Xdebug versions
After some research, I came across these two forum posts, describing exactly the same behaviour: https://developer.apple.com/forums/thread/778184 and https://developer.apple.com/forums/thread/772999. The answer in both was to enable all interface orientations for iPad.
I tried that via the project settings in Xcode:
Alas, selecting all orientations for iPad did not work.
But then I remembered in our app we also (for reasons) define this set of options programmatically, via the AppDelegate. I applied the same changes there:
func application(_ application: UIApplication, supportedInterfaceOrientationsFor window: UIWindow?) -> UIInterfaceOrientationMask {
    switch UIDevice.current.deviceIdiom { // `deviceIdiom` is our own property for handling device idioms
    case .phone:
        return .allButUpsideDown
    case .pad:
        return .landscape
    case .mac:
        return .all // <-- Return `.all` here!
    }
}
And, voila, we have content in popovers on Mac Catalyst!
I decided to use the simplest way and just restart the application if the certificate has changed, using Spring Actuator.
To do that, we should enable the restart endpoint in application.properties:
management.endpoint.restart.access=read_only
and in my ContainerConfiguration.java:
autowire a RestartEndpoint
my method reloadSSLConfig now looks like this:
private void reloadSSLConfig() {
restartEndpoint.restart();
}
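Pulled together, the configuration class might look roughly like this (a sketch only, assuming the RestartEndpoint bean that spring-cloud-context provides once the restart endpoint is enabled; constructor injection would work just as well):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.context.restart.RestartEndpoint;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ContainerConfiguration {

    @Autowired
    private RestartEndpoint restartEndpoint;

    // Called whenever a certificate change is detected; restarts the Spring context
    private void reloadSSLConfig() {
        restartEndpoint.restart();
    }
}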
PS: I've also found an article about hot-reloading SSL in Spring: SSL hot reload in Spring Boot 3.2.0
It looks like the litespeed_docref tag is added automatically by the LiteSpeed server or plugin, and you can usually disable it from the LiteSpeed Cache settings under the Debug or Toolbox section. If there's no option available, you can remove it using a small code snippet in your theme's functions file to strip that meta tag from the header. After making the change, give your site a quick check to confirm the header is clean and the tag is gone.
The Shortcut control returns a flat list of all keys pressed. In the Windows API Register Hotkey world, keys are distinct from modifiers, but in the input handling world (the control), they are all just "keys". You need to iterate through that list and sort the keys: if it is a modifier (ctrl, shift, alt, win), add it to a flags enum; otherwise, treat it as the specific trigger key.
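As a rough, language-agnostic illustration of that sorting step (written in Java here; the Key type and names are hypothetical, not the actual control's API), the classification could look like this:

import java.util.EnumSet;
import java.util.List;
import java.util.Set;

enum Key { CTRL, SHIFT, ALT, WIN, A, B, F5 /* ... */ }

class HotkeyParts {
    static final Set<Key> MODIFIERS = EnumSet.of(Key.CTRL, Key.SHIFT, Key.ALT, Key.WIN);

    Set<Key> modifiers = EnumSet.noneOf(Key.class);
    Key trigger;

    // Split the flat list of pressed keys into modifier flags plus the trigger key
    static HotkeyParts from(List<Key> pressed) {
        HotkeyParts parts = new HotkeyParts();
        for (Key k : pressed) {
            if (MODIFIERS.contains(k)) {
                parts.modifiers.add(k);
            } else {
                parts.trigger = k; // the non-modifier key is the actual hotkey
            }
        }
        return parts;
    }
}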
I use the following config; it works on my local machine:
cat /opt/homebrew/etc/php/8.3/conf.d/xdebug.ini
xdebug.mode = debug
xdebug.discover_client_host = true
Hello, I'm also developing a similar feature at the moment. Could you spare some time to discuss it?
Go to the obj folder (it is in your project folder), remove all files from the debug, release, and x86 folders, then clean your solution and rebuild. That will solve your problem.
AI in Digital Marketing
Introduction
Artificial Intelligence (AI) is transforming the way businesses approach marketing. In digital marketing, AI in digital marketing helps companies understand customer behavior, optimize campaigns, and make data-driven decisions. By integrating AI into marketing strategies, businesses can enhance customer experiences and improve results.
Role of AI in Digital Marketing
AI in digital marketing is used in various areas, such as:
Personalization: AI analyzes user behavior to provide personalized marketing content and recommendations.
Automation: Digital marketing AI tools like chatbots and automated email campaigns save time and improve efficiency.
Data Analysis: AI quickly processes large amounts of data to provide insights for AI marketing strategies.
Content Creation: AI tools assist in generating social media posts, ad copy, and blogs for AI in online marketing campaigns.
Popular AI Tools for Digital Marketing
Some effective digital marketing AI tools include:
Chatbots: Automated customer support (e.g., Drift, ManyChat) for better engagement.
Predictive Analytics: Helps forecast future customer behavior, a key part of AI marketing strategies
Content Generation Tools: AI writing platforms (e.g., Jasper, Copy.ai) enhance content creation for AI in online marketing.
Ad Optimization Tools: AI improves ad targeting and ROI on platforms like Google Ads and Facebook Ads.
Benefits of AI in Digital Marketing
Using AI in digital marketing brings many advantages:
Enhanced Customer Experience: Personalized content strengthens customer loyalty.
Cost Efficiency: Reduces manual tasks and increases productivity.
Better Decision Making: Data-driven insights improve AI marketing strategies.
Scalability: Businesses can manage larger campaigns with less effort.
Competitive Advantage: Companies adopting AI benefits in marketing gain an edge over competitors.
Challenges of AI in Digital Marketing
While AI in digital marketing is powerful, it comes with challenges:
Tool Costs: High-end AI tools can be expensive for small businesses.
Privacy Concerns: AI relies heavily on customer data, which must be carefully handled.
Over-reliance on Technology: Too much dependence on AI may reduce human creativity.
Complex Implementation: Learning and using AI tools requires training and technical knowledge.
Conclusion
AI in digital marketing is revolutionizing how businesses connect with customers. From AI marketing strategies to AI benefits in marketing, the technology enables smarter, more personalized campaigns. Embracing AI is essential for businesses to enhance performance, improve ROI, and stay competitive in today’s digital landscape.
SEO Practice Notes
Primary Keyword: AI in Digital Marketing → used in title, intro, subheadings, and conclusion.
Secondary Keywords: sprinkled naturally through the article for search engine optimization.
Internal Linking Tip: Link to other posts like “Top AI Marketing Tools” or “Digital Marketing Trends 2025” for SEO boost.
Meta Description Suggestion: "Learn how AI in digital marketing is transforming online strategies, enhancing customer experiences, and improving ROI with modern AI marketing tools."
Yes, you can build a food delivery website using WordPress and WooCommerce, but you’ll need a few extra plugins to make it work like a real delivery platform. WooCommerce covers the basic online store part, but the delivery features have to be added separately.
A simple setup usually includes:
WooCommerce – for your products and checkout
A restaurant/food menu plugin – to display food items in an easy-to-browse layout
A location or PIN-code checker – to control where deliveries are available
Delivery date and time plugin – so customers can choose when they want their order
Live order status or tracking add-ons – optional but helpful
Delivery partner/driver management tools – if you want to assign orders to riders
A lot of small restaurants start with this kind of WordPress setup before they move to a dedicated mobile app. For example, apps like Cravess (a growing Food Delivery App in Delhi) usually begin with a similar structure and later shift to custom-built systems when they need advanced features like real-time tracking, multi-restaurant support, or automated payouts.
So yes, WooCommerce works fine for a basic food delivery site, but if you plan to scale or add more complex features, you might eventually need a custom solution.
It is useful in your case: lightweight, native, no heavy setup.
Let me know if you want a setup for that, or you can figure it out.
https://github.com/mlocati/docker-php-extension-installer can also be an approach. Your Dockerfile then might look like:
FROM php:8.2-fpm
COPY --from=mlocati/php-extension-installer /usr/bin/install-php-extensions /usr/bin/
RUN install-php-extensions @composer http (and other extensions supported by the installer)
...
Not yet. Only a drill-to-details table exists. See: https://github.com/apache/superset/tree/master/superset-frontend/src/components/Chart/DrillDetail
To send bulk SMS on WhatsApp, use Digivate IT WhatsApp Sender.
Just install the software, connect your WhatsApp by scanning the QR code, import your contact list (Excel/CSV), type your message, set the sending speed, and click Start Sending. The tool will deliver your WhatsApp messages in bulk with reporting, filtering, and anti-block features.
After a fresh install of VS 2026 I started encountering this issue, but only when I switched to Release mode. I had to go back into the VS Installer and:
Individual Components tab->Scroll down to Compilers, build tools, and runtimes
Then select the MSVC ### Build Tools you are building against
During my initial install I selected v141-v143 from the Desktop development with C++ tab, all of which should have had <filesystem>, but for some reason they didn't install, and the build was defaulting to v140 despite me selecting ISO C++ 17 or ISO C++ 20 as the language standard.
Can't reproduce the result of your first command here. The expected duration is 10.08, which is what I get. Run with -report and share the report.
protected $guarded = [];
do this and then try again
The four VAX floating-point formats (32-bit F format, 64-bit D format, alternate 64-bit G format, 128-bit H format) all have three classes of floating-point data, encoded by a sign bit, an exponent field, and a significand field:
On NetBSD/vax, the fpclassify() function has four possible return values for these cases:
FP_ROP is an example of a non-finite floating-point class other than infinity and NaN.
References:
Admin area -> Repository -> Allow developers to push to the initial commit
You should be able to make a copy of the CMake templates and replace the call to the usual Antlr Tool with a call to the antlr-ng tool.
I think I did that for my version of the CMake files in template form over here: https://github.com/antlr/grammars-v4/tree/61284ea7750274b996021b2b05fa003e9c173222/_scripts/templates/Cpp/cmake. For the default generator (i.e., the usual Java-based Antlr Tool 4.13.2), I replaced that with the "antlr4" Python wrapper, since it downloads Java as well as the .jar.
What OS? The Azure DevOps Server shares a lot of documentation with the cloud version (aka "Azure DevOps Services"), so you should follow articles like this to see how to prepare an agent for your build tasks.
When you work with a buffer, always use flush(), as it forces the data from the buffer to be written to the final destination. Not using flush on a buffer can cause the data not to be written to your file.
public void writeToFile(String fullpath, String contents) {
    // Paths API
    Path filePath = Paths.get(fullpath, "contents.txt");
    try {
        // Files API
        Files.createDirectories(filePath.getParent());
    } catch (IOException e) {
        e.printStackTrace();
        return;
    }
    // Files API
    try (BufferedWriter bw = Files.newBufferedWriter(filePath)) {
        bw.write(contents);
        bw.flush(); // <<--- flush forces the write to the file
    } catch (IOException e) {
        e.printStackTrace();
    }
}
When working with files in Java, use the Paths and Files APIs.
This way, you don't need to worry about operating system issues.
Flush (or close) the BufferedWriter, because it ensures that the data in memory is actually written to the file. Reference:
@Ron Rosenfeld That is another good option. However, can you tweak it more so that it does not display "Zero Dollar" or "Zero Cent" when there is no value? I am not proficient in Python scripting.
@Cy-4AH SwiftPM does not support adding ".a" files as binaryTargets, but it supports "xcframeworks". I have tried to build an "xcframework" and attach it, but with no success (though this was before my success with absolute paths). I will try once more. Thanks for the advice.
https://www.youtube.com/watch?v=6h1WGKJKxXI
explains very nicely how to deal with this problem.
Panda3D will fail to get access to any graphics API (OpenGL, Vulkan, or Direct3D) because repl.it online machines don't have any form of GPU. If you really wish to render graphics there, I would recommend switching to an SDL-based library (like pygame), which does not require a GPU since it only uses the CPU. You may struggle to do 3D graphics, though.
multer decodes the filename using 'latin1', so re-encode it:
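// multer decoded the original UTF-8 bytes as latin1; if every char fits in the latin1 range, re-decode as UTF-8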
if (!/[^\u0000-\u00ff]/.test(req.file.originalname)) {
req.file.originalname = Buffer.from(req.file.originalname, 'latin1').toString('utf8')
}
I ran into a similar problem. I think `planInputPartitions(start, end)` is supposed to be idempotent; it can be called multiple times. Then how should I do things like receiving messages from SQS when triggered? latestOffset() is the place; we need to introduce a cache to hold the result.
Maybe git-filter-repo? (and 17 more characters =))
Could you please provide a minimal reproducible example without any dependencies on things like axios and where you either define things like System.LinkTypes.Hierarchy-Forward or, even better, remove them? It would be very helpful if we could all test any possible suggestions by merely putting code in our IDEs and running it.
=LET(x0_,DROP(A1#,-1),x1_,DROP(A1#,1),IFERROR( (INDEX(x0_,,2)=INDEX(x1_,,2))*(DROP(x1_,,3)="IN")*(DROP(x0_,,3)="OUT")*(INDEX(x1_,,3)-INDEX(x0_,,3)),))
=LET(x_,COUNTA(A:A)-1,b1_,OFFSET(B1,,,x_),b2_,OFFSET(b1_,1,0),c1_,OFFSET(b1_,,1),c2_,OFFSET(c1_,1,0),d1_,OFFSET(b1_,,2),d2_,OFFSET(d1_,1,0),y_,IFERROR((d2_="IN")*(d1_="OUT")*(b2_=b1_)*(c2_-c1_),),y_)
sample file here
Just use temporal.io. It will eliminate 90% of the complexity that event-driven approaches require.
Does defining the range as A2:A instead of A2:A39 help?
https://www.reddit.com/r/C_Programming/comments/xm8f8e/why_doesnt_c_have_a_standard_macro_to_determine/
Be aware that a few CPUs have dynamic endianness :(
Could not load list
403 - Forbidden
Google Drive API has not been used in project 1059907167452 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/drive.googleapis.com/overview?project=1059907167452 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
Check your credentials.
This was answered (in the negative) on the tox discussions.
https://github.com/tox-dev/tox/discussions/3648#discussioncomment-15067215
The same issue happened to me just now! Any tips? And how do I access the logs? I'm new to WP.
I see. Thank you. I guess now that you explained it I sort of knew that. Thank you again.
Avoid bash, sed, awk et al. and use https://github.com/mikefarah/yq instead.
See https://unix.stackexchange.com/questions/646851/struggling-using-sed-command-with-variables.
@kikon
Final question. And once again, very grateful for walking me through this.
I think I understand everything except for the condition i - j - 1 < new_str.length as part of that double-condition if statement. You explained the role of
i - j - 1 < new_str.length
as follows:
"if i - j - 1 < new_str.length results in ignoring all the positions that are not in the initial string. So we can write the imaginary strings as '*bcddc?' and '**cddc??' where * stands for a character that is ignored for the palindrome test."
How does i - j - 1 < new_str.length ignore all of the positions of the initial 6-character string, and by ignoring those positions, as in
'*bcddc?'
does that mean that b is now index 0, c is index 1, and so forth?
type "git checkout master"
And "git status" in your remote repository to see the file status. See if some file has been modified or deleted.
If you want to be creative, you can try this library:
https://github.com/ggutim/natural-date-parser
Supports conversion of strings like "January 2, 2010" into java.time.LocalDateTime objects out of the box, without any configuration.
Lightsail buckets now support CORS configuration through the AWS CLI:
Create a JSON file containing your CORS configuration. For example, create a file named cors-config.json with the following content:
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://example.com"],
      "AllowedMethods": ["GET", "PUT", "POST"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
Use the AWS CLI to apply the CORS configuration to your bucket:
aws lightsail update-bucket --bucket-name amzn-s3-demo-bucket --cors file://cors-config.json
Verify the CORS configuration was applied successfully:
aws lightsail get-buckets --bucket-name amzn-s3-demo-bucket --include-cors
please refer to: https://docs.aws.amazon.com/en_us/lightsail/latest/userguide/configure-cors.html
I disagree with the fellow user. XY questions are valid on SO and really do benefit the community.
Duplicate question mods.
Seen it before.
I asked how to do this for curiosity's sake more so than trying to solve a specific problem (the specific problem i was working with when this came to mind probably wouldn't have been a good use case for this anyways).
What debugging steps have you taken with this code that only finds one index? Is venueBySku what you expect it to be? I don't think db.venueDB.values() returns what you think it does.
When using partitioned tables in PostgreSQL, SQLAlchemy does not require any special syntax. You query them the same way you would query a regular table. PostgreSQL handles the partition pruning automatically.
stmt = select(UserOrm).where(UserOrm.birth_year == 1990)
result = session.execute(stmt).scalars().all()
I believe I managed to figure it out. It seems that because I wasn't using the apply method in the render method to both viewports, they weren't functioning properly.
Python 2 reached EOL in 2020. Consider updating to Python 3.
The main problem happens when you filter the second dataset right here:
data = data_2_schule_schulform %>%
filter (Anzahl >= 3),
This subsets the data_2_schule_schulform object inside the geom_text call, making it "misaligned" with the data_2_schule_schulform inside the geom_bar() call just above it. Removing that and using the same ifelse logic you used before is the first fix. Second, you're passing fill into geom_text, which is ignored since it doesn't support it. You should be using group instead. The quick fix is thus:
geom_text(
data = data_2_schule_schulform,
aes(
group = Schulform,
y = Anzahl * -1,
x = Schuljahr,
label = ifelse(
Anzahl >= 3,
comma(Anzahl, accuracy = 1L, big.mark = ".", decimal.mark = ","),
""
)
)
)
Part of the confusion with your example is because you're using two different datasets in a same plot. If possible, consider stacking the datasets into a single one: this would simplify it immensely.
That said, your code has several other problems you might consider reviewing:
guides(alpha = 'none') is doing nothing.
theme(legend.position = 'none') and labs(fill = 'none') are doing the same thing.
geom_bar() should be used when you want the height of the bar to represent the count of cases. If you want the heights of the bars to represent values in the data, use geom_col(). In other words, geom_bar(stat = 'identity') is the same as geom_col(), which you should be using.
The comma function isn't actually doing anything since your numbers don't have decimals, and the same goes for scale_y_continuous(labels = function(x) format(x, big.mark = ".")) since the numbers are all below 1000.
size inside geom_hline is deprecated. Also, this horizontal line is actually making it hard to see the plot; consider removing it or making it smaller (e.g. linewidth = 0.8).
I took the liberty of making some adjustments to create a general solution to your problem.
library(ggplot2)
library(scales)
library(dplyr)
data_schule_schulform <- structure(
list(
Schuljahr = c(
"2017",
"2018",
"2018",
"2019",
"2019",
"2020",
"2021",
"2021",
"2022",
"2023",
"2023",
"2024",
"2024",
"2024"
),
Herkunftsschulform = c(
"Gymnasium",
"Förderschule",
"Gymnasium",
"Förderschule",
"Gymnasium",
"Gymnasium",
"Gesamtschule",
"Gymnasium",
"Gymnasium",
"Gymnasium",
"Sonstiges",
"Förderschule",
"Gymnasium",
"Sonstiges"
),
Anzahl = c(7, 2, 2, 1, 6, 2, 1, 2, 4, 1, 57, 1, 8, 44)
),
class = c("tbl_df", "tbl", "data.frame"),
row.names = c(NA, -14L)
)
data_2_schule_schulform <- structure(
list(
Schuljahr = c(
"2017",
"2018",
"2019",
"2019",
"2019",
"2021",
"2022",
"2022",
"2023",
"2023",
"2023",
"2024",
"2024",
"2024",
"2024"
),
Schulform = c(
"Hauptschule",
"Hauptschule",
"Förderschule",
"Gymnasium",
"Hauptschule",
"Hauptschule",
"Gymnasium",
"Hauptschule",
"Förderschule",
"Gesamtschule",
"Hauptschule",
"Förderschule",
"Gesamtschule",
"Gymnasium",
"Hauptschule"
),
Anzahl = c(3, 1, 1, 1, 5, 3, 1, 4, 1, 1, 2, 1, 1, 1, 9)
),
class = c("tbl_df", "tbl", "data.frame"),
row.names = c(NA, -15L)
)
df_text_positive <- data_schule_schulform |>
mutate(
label = ifelse(
Anzahl >= 3,
comma(Anzahl, accuracy = 1L, big.mark = ".", decimal.mark = ","),
""
)
)
df_text_negative <- data_2_schule_schulform |>
mutate(
label = ifelse(
Anzahl >= 3,
comma(Anzahl, accuracy = 1L, big.mark = ".", decimal.mark = ","),
""
)
)
ggplot() +
geom_col(
data = data_schule_schulform,
aes(fill = Herkunftsschulform, y = Anzahl, x = Schuljahr)
) +
geom_text(
data = df_text_positive,
aes(
group = Herkunftsschulform,
y = Anzahl,
x = Schuljahr,
label = label
),
position = position_stack(vjust = 0.5),
size = 3,
color = "black",
fontface = "bold"
) +
geom_col(
data = data_2_schule_schulform,
aes(fill = Schulform, y = Anzahl * -1, x = Schuljahr)
) +
geom_text(
data = df_text_negative,
aes(
group = Schulform,
y = Anzahl * -1,
x = Schuljahr,
label = label
),
position = position_stack(vjust = 0.5),
size = 3,
color = "black",
fontface = "bold"
) +
theme_minimal() +
theme(
legend.position = "none",
axis.text.y = element_text(size = 8)
)
Unfortunately, I can't post the finished image due to my low reputation. But the code above should work for your case.
Not sure what you mean, David, tbh. But I've come here for a healthy discussion.
Please tell me if it is bad [...] it doesn’t work at all
Well, should it work at all? If so then that would pretty clearly imply some measure of "bad" if it doesn't do what it's intended to do.
Is the question you're asking really the question you meant to ask?
I’m facing the same getConfig issue while configuring React Native Config in React Native 0.78.0.
Close this as off topic please.
I can find one index using db.SKU, but I'm not sure how (or what the correct code is) to find all the indexes.
Using this code:
const venueBySku = db
.venueDB
.values()
.map((venue) => [venue.SKU, venue]);
const lookup = new Map(venueBySku);
const result = db.SKU
.filter(sku => lookup.has(sku))
.map(sku => lookup.get(sku));
console.log(result);
But how would I then add additional filters, i.e. also check that it is active and return the round number.
Thanks.
My company is experiencing this same issue along with others as well. Let's get as many people as we can to upvote this support ticket and get a Meta engineer looking into this asap. If you have a direct connection with someone at Facebook, reach out so it can be escalated faster.
https://developers.facebook.com/community/threads/1581825999919516/
For standalone Spark see this example; do not connect to sc://192.168.2.5:15002, which is the Spark Connect port. If you want Spark Connect, then you need to make sure the service is running.
Have you found an answer? I'm facing the same issue.
The additional bar you’re seeing isn’t part of the build output. It’s simply Sublime Text’s Status Bar, which displays the current cursor position (e.g., "Line 14, Column 24").
If you want to hide it, you can disable it just like any other UI element:
Via the menu
View -> Hide Status Bar
Via the Command Palette:
Open the Command Palette (Ctrl+Shift+P or Cmd+Shift+P)
Run: View: Toggle Status Bar
Once disabled, only the build result panel will remain visible. You can toggle the Status Bar back on at any time using the same command.
@twinlakes Thanks! That sounds like it would work so I think you understand perfectly. I also think that would speed things up quite a bit and look much cleaner. Thinking ahead, it's not the worst thing if there's a duplicate (entry that appears in both or even all 3 lists), but do you know of a good way to check? I think that part can't be done in the same loop because it's possible the entry could be 'overthrown' in one dictionary, but not the other. Perhaps something like (just pseudocode)
loop through first dictionary {if entry appears in dict2 or dict3 skip, else log}
loop through second dictionary {if entry appears in dict3 skip, else log}
loop through third dictionary {log all entries}
Solved it. I needed to enter
sudo crontab -e
instead of
crontab -e
Because what I forgot was that they don't open the same file: one opens the active user's cron jobs, and the other opens root's cron jobs.
You're better off building a "result database" and going after that (IMO). Needs evolve. I store (buffer) query results by translating the query parameters into a "key" that is matched against future queries, to reduce "trips" when the "results" can be predicted. Your "score and speed" can be your "result key", where one is multiplied by an offset and then they are added together; e.g. (long or int) key = (score * 10000) + speed. You can "plot" results instead of just looking at the "best" ones, which then also lets you start "predicting" (extrapolating).
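A tiny sketch of that composite-key idea (the 10000 offset assumes speed always stays below 10000; adjust it to your value ranges):

public class ResultKey {
    // Compose a single lookup key from two result dimensions (assumes 0 <= speed < 10_000)
    static long key(int score, int speed) {
        return (long) score * 10_000 + speed;
    }

    public static void main(String[] args) {
        long k = key(87, 342);
        System.out.println(k);          // 870342
        System.out.println(k / 10_000); // score recovered: 87
        System.out.println(k % 10_000); // speed recovered: 342
    }
}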
Yes, "tuple concurrently updated" happens because CREATE OR REPLACE FUNCTION doesn't acquire an exclusive lock. You can see it with:
pgbench -c 10 -f /dev/stdin <<<"create or replace function test() returns int as 'select 42' language sql"
scontrol show partitions | grep PartitionName | sed s/PartitionName=//
In your loop, check whether the worksheet name already exists against the value.
I have it working now. During the upgrade, build errors were stating the datasources needed to use "jdbcUrl" rather than "url". What I found is that if the main datasource (for the main Grails app) is set to use "url" and the secondary datasource for a Spring Boot subproject is set to use "jdbcUrl", then there are no build/Hikari errors and the app seems to run fine. There was also a difference in spelling between one of the datasource names in application.yml and the external config file I was using for testing (case matters).
It may not be traditional, but it's been done before. https://firefox-source-docs.mozilla.org/build/buildsystem/unified-builds.html lists some of the pros and cons of passing all source code to the compiler at once.
If you want to avoid defining structs entirely (especially if the rules may change or have lots of fields), I wrote a small library called gyaml that lets you query YAML dynamically without struct definitions:
https://github.com/m4l1c1ou5/gyaml
Useful when the schema isn’t stable. If you try it and see any issues, feel free to open one.
Reduce() returns a single value, so I think that's probably not what you are looking for. Where are you stuck with using filter()?
@VLAZ thank you, I believe this is the answer I was looking for. Indeed there is no error if one adds `__proto__: null`, and it all makes sense after that. I personally don't see a need to have a separate syntax if this is available.
AUTHORIZATION FOR THE USE OF A VEHICLE AND TRAVEL ABROAD
OWNER'S DETAILS
I:
DNI/NIE:
Residing at:
DETAILS OF THE AUTHORIZED PERSON
I authorize:
DNI/NIE:
Residing at:
VEHICLE DETAILS
• Make:
• Model:
• License plate:
• Chassis number (VIN):
PURPOSE OF THE AUTHORIZATION
I authorize the use and driving of the vehicle by the person indicated, as well as travel between Spain and Morocco (round trip).
AUTHORIZED PERIOD
From:
To:
DECLARATION
I declare that I am lending the use of the vehicle only for personal and travel purposes, and that the documentation, insurance, and ITV (roadworthiness inspection) are in order.
It could be that you need to update your PGNP provider to 1.5 (https://www.pgoledb.com/index.php/download); it solved the issue for me.
For working with the MQTT protocol, there is a lot you can do to handle the subscription, and it can be added in many ways: https://github.com/secretcoder85-sys/Transfer_protocols
It seems like Electron has some security policy that does not allow it to get geolocation info, or maybe Google disallows getting coordinates from third-party apps like Electron.
// Source - Springboot - validate @RequestBody
// Posted by PatPanda, modified by community. See post 'Timeline' for change history
// Retrieved 2025-11-24, License - CC BY-SA 4.0
public class Foo {
    private int important;
    private String something;

    // constructors, getters, setters, toString
}
You can also use :enabled, for a simpler query.
https://developer.mozilla.org/en-US/docs/Web/CSS/Reference/Selectors/:enabled
var inputs = document.querySelectorAll('input:enabled, select:enabled');
Nov 24th, 2025: https://bugs.webkit.org/show_bug.cgi?id=160953#c24 Mentions the issue being isolated to the presence of -webkit-overflow-scrolling: touch;. Removing this did work for me, and now my fixed positioned children do inherit the width and height of the viewport instead of the width and height set by their fixed positioned parent!
Thank you. I am kind of surprised it has to be distilled from the source code while official documentation completely ignores this. But it is not a topic to be discussed here :)
PyGame is a nice library wrapping up all the stuff you might need in a platform neutral manner.
See its joystick support here: https://www.pygame.org/docs/ref/joystick.html
I copied your code and tried to reproduce your problem, but it seems you use a deprecated package ('comma') and I didn't want to start installing something I didn't need to. The suggestion was to switch to label_number()/label_comma() so could the issue be something to do with that?
Or it could maybe be that you multiply the y-values by -1.
Wish I could help more!
Thank you both, Stanislav and smallpepperz, for your kind words.
I find the interface for these new kinds of questions quite confusing. One doesn't get notified if @name is used, since these are posts, not messages, although they function much like messages and there are no actual messages.
The point is that it's difficult to find out when a reply/clarification question is posted. Now that I know, I'll come back here often for a while.
You cannot reliably add javax.script (the JSR-223 scripting API) to an Android app using Gradle; the toolchain does not support plugging in arbitrary JDK java.* / javax.* classes like this.
Shift-Command-T to open Terminal in macOS Recovery.
I know this is an old issue but my friend and I created a library for this exact purpose! It takes any arbitrary text and segments it into morphemes. Here's a bunch of links for it below and I hope yall find it helpful!
The thing is that to translate code to binary, you need to know binary and write your compiler accordingly. Assembly, being the first programming language to exist, is the closest thing to binary that is still kind of readable-ish (with great emphasis on "kind of"), yet it translates directly to binary: every line of assembly code is a binary instruction, just polished to "look" human readable. That is why you'll find it used in compilers for high-level languages, in the code that builds operating systems, and in applications that can't afford to waste memory.
Use env_file as a list in the combined class:

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        env_prefix="APP_",
        env_file=[".env.database", ".env.auth"],  # multiple files supported
        env_file_encoding="utf-8",
        extra="ignore",
    )

    # Explicitly declare fields so the IDE knows them
    db_host: str = "localhost"
    auth_secret_key: str = "change-me"

settings = Settings()
print(settings)

# Settings(db_host='db.example.com', auth_secret_key='secret-from-env-file')