To expose an environment variable to the browser, it must be prefixed with NEXT_PUBLIC_. However, these public environment variables will be inlined into the JavaScript bundle during next build.
You can check the documentation for details.
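For example, a minimal sketch (the variable names below are made up):
// .env.local
// NEXT_PUBLIC_ANALYTICS_ID=abc123   -> inlined into the client bundle at next build
// DB_PASSWORD=secret                -> server-only, never shipped to the browser

// Anywhere in your components:
console.log(process.env.NEXT_PUBLIC_ANALYTICS_ID); // available in the browser
// process.env.DB_PASSWORD is only defined in server-side code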
There is no technical reason for this concept. It is more likely a result of the logical model that GCP uses.
In networking, most prioritization rules also follow this concept, where lower values are prioritized first. For example, when creating ACLs (Access Control Lists) on networking hardware, such as Cisco devices, the system evaluates rules starting with the ones that have lower numbers. Other vendors, such as Juniper and Palo Alto, also use this concept in their firewalls. This approach was chosen so that most network engineers can apply what they have been practicing in the field. On the other hand, it seems that GCP has chosen to use a higher value to represent a higher priority in Artifact Registry.
There is no dropdown option for the community version.
The above answer will remove the outline from all elements, including input fields. It is better to remove it only from div, section, etc.
div:focus,
section:focus {
  outline: none;
}
Simply right-click on Dependencies -> Manage NuGet Packages. Go to each offending package and update it to the newest version within your framework's major version. If you are using .NET 8, update each package to the latest 8.x release (e.g., 8.15); if you are using .NET 9, update the specific package to the latest 9.x (e.g., 9.13).
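If you prefer the command line, the equivalent is roughly this (the package name and version below are placeholders):
dotnet add package Some.Package --version 8.15.0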
This solution was working fine; I had uploaded multiple reports that were all tested as well. Now they are all coming up blank because the added params have stopped working, and I don't know why.
Workaround (but slow): I now select all data from the database and apply the params in the report filter tabs.
My pipeline: MySQL -> ODBC -> Power BI Gateway -> Power BI Service
I have just begun to use NiFi to read in a JSON array and am trying to save the output into a MySQL table.
#include <iostream>
#include <vector>
#include <unordered_map>
using namespace std;

int main() {
    vector<int> nums = {1, 2, 3, 2, 4, 1, 2};
    unordered_map<int, int> freq;
    for (int num : nums) {
        freq[num]++;
    }
    cout << "Repeated elements:\n";
    for (auto &entry : freq) {
        if (entry.second > 1) {
            cout << entry.first << " appears " << entry.second << " times\n";
        }
    }
    return 0;
}
ASSOCIATE STATISTICS WITH FUNCTIONS card_varchar2 USING card_varchar2_ot;
This command works only under the SYS user; for any other user I get the error "Component must be declared". Can anyone tell me what this is related to?
In case you are still facing similar issue, and the above solutions did not work, here is a solution that worked for me:
First, I ran the following command to identify which file was the root cause:
node ./node_modules/@react-native-community/cli/build/bin.js config
This returned the following, indicating an issue with the AndroidManifest.xml of my react-native-screens dependency. Specifically, the package declaration was missing:
error Failed to build the app: No package name found. Found errors in <root-path>\node_modules\react-native-screens\android\src\main\AndroidManifest.xml.
Please note that the issue might be in another dependency.
Once you've identified the issue, just add the necessary package name, e.g.
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.swmansion.rnscreens"> </manifest>
Note that I also downgraded my Node version to 18, although that is not mandatory.
Yeah, I have the same question. Thank you!
As Benzy Neez said, adding the chart to the Section header and using .listRowInsets to snap it to the edges works fine.
List {
    Section {
    } header: {
        ZStack {
            Color.white
            BasicHealthRangeCharts(samples: viewModel.healthData.compactMap(ChartHealthSample.init(from:)))
                .padding(.horizontal)
        }
    }
    .listRowInsets(EdgeInsets(top: 0, leading: -32, bottom: 0, trailing: -32))

    Section {
        Button {
            viewModel.onPinButton()
        } label: {
            Text(viewModel.isPinned ? "Unpin from summary" : "Pin in summary")
                .foregroundColor(.primary)
                .padding()
        }
    } header: {
        Text("Options")
            .padding(.vertical, 16)
    }
    .listRowInsets(EdgeInsets(top: 0, leading: 0, bottom: 0, trailing: 16))
    .headerProminence(.increased)

    Section {
        NavigationLink {
            PetHealthAllDataView(dataType: viewModel.type, data: viewModel.healthData)
        } label: {
            Text("Show all data")
        }
    }
}
Myapp List API
GET /?format=api
HTTP 400 Bad Request
Allow: GET, OPTIONS
Content-Type: application/json
Vary: Accept
{ "status": "error", "error": "start_date and end_date are required." }
Files created using the sandbox SDK in a devbox have a temporary lifespan.
Lifespan: Files created in a devbox typically last for the duration of your devbox session, which has a limited time period (often 2-6 hours, depending on the specific platform configuration).
After deadline period: When the deadline/time limit expires:
The devbox instance is terminated
All files and data stored within the devbox are deleted
The environment essentially disappears completely
Reconnection attempts: If you try to reconnect to a devbox using its ID after the deadline period:
You'll receive an error message indicating that the devbox no longer exists
You won't be able to access any of your previous files or work
You'd need to create a new devbox instance and start fresh
To preserve your work from a devbox, you should:
Download important files before the session ends
Commit and push changes to external repositories
Use persistent storage options if they're available with your specific sandbox SDK
You can set compaction_static_shares or compaction_throughput_mb_per_sec; both can be updated without restarting the server (just send SIGHUP).
To resolve the "Publish Date not found" issue on LinkedIn, ensure that your HTML head contains the correct "article:published_time" (or "og:published_time") tag.
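For example, something like this in the page's <head> (the timestamp below is just a placeholder):
<head>
  <meta property="og:type" content="article" />
  <meta property="article:published_time" content="2024-05-01T09:00:00+00:00" />
</head>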
Yes, you can deploy a React.js web app to shared hosting, as long as the hosting provider supports static files—which most of them do.
When you build your React app for production using a command like npm run build, it generates a folder called build that contains all the static files (HTML, CSS, and JavaScript). These files can be uploaded to the public_html or root directory of your shared hosting account using an FTP client like FileZilla or the hosting provider's file manager.
If your app uses React Router for navigation, you might need to configure the .htaccess file (on Apache servers) to handle routes properly and prevent 404 errors on refresh.
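For example, a commonly used .htaccess for single-page apps looks roughly like this (assuming the app is served from the domain root; adjust RewriteBase otherwise):
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteBase /
  RewriteRule ^index\.html$ - [L]
  RewriteCond %{REQUEST_FILENAME} !-f
  RewriteCond %{REQUEST_FILENAME} !-d
  RewriteRule . /index.html [L]
</IfModule>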
A few tips:
Shared hosting is fine for simple or static React apps.
If your app depends on server-side functionality or Node.js, shared hosting won’t work—you’ll need a VPS or a platform like Vercel or Netlify.
Be mindful of performance, especially if your app is expected to scale.
In short, shared hosting is a good, budget-friendly option for deploying basic to moderately complex React apps.
<intent>
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
    <data android:scheme="https" android:host="meet.google.com" />
</intent>
<queries>
    <package android:name="com.google.android.apps.meetings" />
    <package android:name="com.google.android.gm" />
    <package android:name="com.samsung.android.mcfds" />
    <package android:name="com.samsung.android.smartthings.headless" />
    <provider android:authorities="com.samsung.android.sdk.ocr.resourcemanager" />
</queries>
signal = np.array([[1,2,3],[1,2,3],[1,2,3]])
signal = np.array(list(map(tuple, signal)), [('left', '<i2'), ('center', '<i2'), ('right', '<i2')])
signal['right']
array([3, 3, 3], dtype=int16)
Great initiative working on a network setup for a 120-user lab — that’s a solid real-world scenario to tackle in a student project.
I checked the code you shared — it works fine on my Chrome. But your issue might be due to one of these:
Your Chrome version might be older — try updating it.
Could be related to zoom level or screen scaling (like 125%). Try setting zoom to 100%.
Try running it in incognito mode — maybe an extension is affecting it.
Or it might just be a small browser glitch — try clearing cache or refreshing with Ctrl + Shift + R.
If the broker is unavailable, the producer will "retry" forever, but it's not really retrying. That's why the retries setting in the config does not come into play.
This means you should update your PyTorch (torch) installation, which is safer and more robust:
pip install --upgrade torch
conda update pytorch
If the above suggestions don't resolve your problem, visit safetorch.github.io for more info about this error.
You can easily achieve this with https://protectweb.site — it's a free tool that lets you create dynamic text-based watermarks and embed them into any webpage with a simple script tag. No coding or design required, just configure your watermark and copy-paste. Super handy if you're looking for a quick and customizable solution!
You can add a field to your model class such as isFavourite of boolean type and set its default value to false. Then, when anyone clicks the like button, toggle the value of isFavourite and save it to the database. That will do what you want.
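A minimal sketch of the idea, assuming a Kotlin data class (the class, field, and DAO names are placeholders; adapt them to your own persistence layer):
data class Item(
    val id: Long,
    val title: String,
    var isFavourite: Boolean = false  // default value is false
)

// Called when the like button is clicked: toggle the flag and persist it.
fun onLikeClicked(item: Item) {
    item.isFavourite = !item.isFavourite
    // e.g. itemDao.update(item), or whatever call saves it to your database
}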
IBM Business Process Management (IBM BPM) is a comprehensive platform designed to model, execute, monitor, and optimize business processes. It’s part of IBM’s broader automation suite and is used by enterprises to improve efficiency, reduce bottlenecks, and streamline operations across departments.
IBM BPM combines workflow automation, business rules, and real-time analytics, allowing organizations to:
Design and visualize end-to-end business processes
Automate repetitive tasks
Improve collaboration across teams
Make data-driven decisions
While tools like IBM BPM focus on optimizing operational processes, organizations also use complementary platforms like Star360Feedback for improving people processes — such as leadership development, team performance, and feedback systems. When business operations and people development are aligned, companies can scale more effectively.
So, if you're looking at IBM BPM, you're probably interested in efficiency and performance — and combining it with tools that optimize people and leadership (like 360-degree feedback) can deliver even stronger results.
Hi ninthbit,
Thank you very much for your feedback — it was really helpful to me.
I still need some clarification about creating multiple feature reports under different IDs.
I didn’t quite understand what you meant by: “each report to be written on top of the feature_report file separately.”
Could you please share a snapshot of your implementation with us?
Thank you for your time.
int main() {
    string val = foo(); // copy, no reference
    printf("%s\n", val.c_str());
    return 0;
}
The simplest way is using listRowInsets(_:) to set zero insets for your item:
VStack {
    // ...
}
.listRowInsets(EdgeInsets())
I was able to achieve this behavior:
Using this code:
List {
    // First item with custom insets
    VStack {
        Text("Edge to Edge Content")
            .frame(maxWidth: .infinity, alignment: .leading)
            .background(Color.gray.opacity(0.2))
    }
    .listRowInsets(EdgeInsets())
    .listRowSeparator(.hidden)

    // Standard list items
    ForEach(1..<10) { i in
        Text("Item \(i)")
    }
}
.listStyle(.plain)
<Error>
<Code>NoSuchBucket</Code>
<Message>The specified bucket does not exist</Message>
<BucketName>file-chatgpt</BucketName>
<RequestId>J4BDZRNCM8NCCVYT</RequestId>
<HostId>HBNDY0WI44aGVlYRlO2xP8GPLhu+W60DnZladM4njAQwkh/zyQcdpRpG9doaGLgBV5kpbVzTWqzGMluw0KzLJKB+rtFnpB506877oq3GE4Y=</HostId>
</Error>
I am unaware of any command line options, though I'm sure they exist.
Here is a website that appears to have worked for the sample .mol file I had to hand.
I think you need to include the subsidiary in your block so that you're able to source the currency correctly. Also, make sure the currency ID is available on the customer record.
This answer assumes that you are somewhat familiar with GNU Gettext but have problems with the integration in PHP.
https://www.php.net/manual/en/function.gettext.php
These rules saved me 2 days of troubleshooting:
Set LC_ALL to the very same language directory. E.g. LC_ALL=it_IT.utf8 if you have /var/www/myproject/locale/it_IT.utf8/LC_MESSAGES/com.myproject.mo
Set textdomain() to the very same .mo basename. E.g. if the .mo is named com.myproject.mo, the domain is com.myproject
Be sure that it_IT.utf8 is available in your system. Use locale -a to check it.
Unset LANGUAGE, since it may take precedence over LC_ALL.
Use strace to understand what .mo files PHP is opening.
Adopt this filesystem structure:
/var/www/myproject/locale/com.myproject.pot
/var/www/myproject/locale/it_IT.utf8
/var/www/myproject/locale/it_IT.utf8/LC_MESSAGES
/var/www/myproject/locale/it_IT.utf8/LC_MESSAGES/com.myproject.mo
/var/www/myproject/locale/it_IT.utf8/LC_MESSAGES/com.myproject.po
The important part is the location of the .mo binary files.
Glossary:
/var/www/myproject/locale: the GNU Gettext locale path (it contains the languages)
com.myproject: the GNU Gettext domain
it_IT.utf8: the language
Note for newcomers: if you don't know how to generate the .mo or .po files, read the official documentation of GNU Gettext. Look for questions like "how to generate a .po file" and "how to generate a .mo file".
The .po file
The .po file should contain Language: it_IT (without utf8) and the GNU Gettext domain. Minimal example:
msgid ""
msgstr ""
"Project-Id-Version: com.myproject\n"
"Language: it_IT\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
#: template/header.php:20
msgid "Italy"
msgstr "Italia"
...
Important: do not create the .po file manually. Read "how to generate a .po file with GNU Gettext" (msgmerge).
Important: if you change a .po file, always re-generate the related .mo files (msgfmt).
Important: if your .mo files change, you may need to restart your webserver, since GNU Gettext has an aggressive cache in PHP. This is good for performance but not good for testing. There are workarounds for this (e.g. a special 'nocache' symlink) but it's a bit off-topic.
Use locale -a to check if the list of your available locales contains the expected ones (like it_IT.utf8). Example output:
C
C.utf8
en_US.utf8
it_IT.utf8
POSIX
If one of your languages is not in this list, you must install it in your system first, for example like this on Debian or Ubuntu:
sudo dpkg-reconfigure locales
Look for related questions about "How to reconfigure locales in Debian" or whatever you are using, if the above command does not work for you.
Create a simple function that activates a language:
<?php

// GNU Gettext domain.
define('LOCALE_DOMAIN', 'com.myproject');

// GNU Gettext locale pathname, containing languages.
define('LOCALE_PATH', '/var/www/myproject/locale');

/**
 * Apply the desired language.
 *
 * @param $lang string Language, like 'it_IT.utf8' or 'C' for native texts.
 */
function apply_language(string $lang): void
{
    // Unset this env since it may override other env variables.
    putenv("LANGUAGE=");

    // Set the current language.
    // In some systems, setlocale() is not enough.
    // https://www.php.net/manual/en/function.gettext.php
    putenv("LC_ALL=$lang");
    setlocale(LC_ALL, $lang);

    // The 'C' language is quite special and means "source code".
    // You can stop here to save some resources.
    if ($lang === 'C') {
        return;
    }

    // Set the location of GNU Gettext '.mo' files.
    // This directory should contain something like:
    //   /locale/it_IT.utf8/LC_MESSAGES/$domain.mo
    bindtextdomain(LOCALE_DOMAIN, LOCALE_PATH);

    // Set the default GNU Gettext project domain and charset.
    bind_textdomain_codeset(LOCALE_DOMAIN, 'UTF-8');

    // Set the GNU Gettext domain.
    textdomain(LOCALE_DOMAIN);
}
Then try everything:
...
// Set desired language.
apply_language('it_IT.utf8');
// Try language.
echo gettext('Italy');
Expected output:
Italia
Note: as already said, at this point the .mo files are aggressively cached. Consider adding the "nocache" trick from the other answer on this page; it is a very nice way to refresh the cache.
Note: the function _('Italy') is a short alias for gettext('Italy').
If you still do not see anything translated, prepare a minimal example like example.php with minimal GNU Gettext tests and run it like this from your command line:
strace -y -e trace=open,openat,close,read,write,connect,accept php example.php
In this way you can see the .mo files that are opened by PHP.
Example output line:
openat(AT_FDCWD</home/user/projects/exampleproject>, "/home/user/projects/exampleproject/locale/it_IT.utf8/LC_MESSAGES/org.refund4freedom.mo", O_RDONLY) ...
As you can see, the strace command is very powerful for detecting what's happening, so you can tell whether a configuration is wrong.
If this answer has not fixed your issue, please share more details about your structure and your code, and share the strace output from a minimal PHP file.
It is because multiple board definitions have the same USB VID and PID. It is a shortcoming of the esp32 Arduino platform.
I agree with Wernfried that you should not store your dates/timestamps as strings, but with your source being a Hive table, maybe somebody else made this decision; at least try to store it as DATE (which actually is a timestamp) in Oracle.
You can either achieve this in SQL as described above or use DataStage as the T of your ETL, which is what your customer purchased it for, I would guess.
In the Transformer stage
StringToTimestamp(YourInputColumn,'%mmm %dd %yyyy %H:%nn%aa')
Output of this in a Peek stage
STRING_COL:Jan 18 2019 1:54PM TS_COL:2019-01-18 13:54:00
The StringToTimestamp function (and the date and time formats it uses) is well documented in DataStage.
In CubeMX (file: Makefile): missing source files
Core/Src/syscalls.c
Core/Src/sysmem.c
=>
Add one line in the Makefile:
C_SOURCES += Core/Src/syscalls.c\
Core/Src/sysmem.c
(after C_SOURCES in the file)
Solved after restarting my mobile device.
In my case the problem was the enabled proxy. It was solved by adding the destination address to the proxy exceptions.
I found it out: if you do SPC h b b, it lists all commands. It shows that commenting a line is s-/; however, it was unclear to me what s- is supposed to mean, as other commands also start with s-. After trying lots of options, I found that s- means ctrl/cmd; therefore, to comment a line it is ctrl/cmd + /.
Try torch.einsum. It is vectorized and uses Einstein notation (https://en.wikipedia.org/wiki/Einstein_notation).
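A minimal sketch of what that looks like (the shapes and subscripts below are made up; adapt them to your actual operation):
import torch

a = torch.randn(32, 10, 4)   # batch of 32 matrices, each 10x4
b = torch.randn(32, 4, 7)    # batch of 32 matrices, each 4x7

# 'bij,bjk->bik' sums over the shared index j for every batch element b,
# i.e. a batched matrix multiply, equivalent to torch.bmm(a, b).
c = torch.einsum('bij,bjk->bik', a, b)
print(c.shape)  # torch.Size([32, 10, 7])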
If the error originates from the Heroku logs, it is most likely that you hard-coded a path to the directory in your code (because Heroku does not have a directory like /C:/Windows/TEMP in deployment).
Otherwise please share the code so we can help trace where the error originates :)
This is right, thanks to @Swechchha.
I have the same issue, but when there is a value in the JSON it comes through as null. Any answers?
This normally happens when the framework hasn't been booted. In most cases, you're extending the wrong TestCase class. If you extend the PHPUnit base TestCase class, the framework isn't booted.
It worked for me after adding parent::setUp(); in public function setUp():
abstract class TestCase extends BaseTestCase
{
    public function setUp(): void
    {
        parent::setUp();
        // Do your extra thing here
    }
}
Using Live Activities in a Kotlin Multiplatform (KMP) iOS project requires bridging native Swift/Objective-C APIs with KMP. Since Live Activities are an iOS-specific feature introduced in iOS 16.1 via ActivityKit, they are not directly available in Kotlin. However, you can interact with them via expect/actual declarations or platform-specific Swift code.
Here's how to approach this:
1. Configure the iOS Targets in the Shared Module
In your shared module's build.gradle.kts:
kotlin {
    iosX64()
    iosArm64()
    iosSimulatorArm64()

    sourceSets {
        val iosMain by getting {
            dependencies {
                // your iOS-specific dependencies
            }
        }
    }
}
2. Create Swift Code for Live Activities
Create a Swift file in the iosApp module (or your iOS target module):
import ActivityKit

struct LiveActivityManager {
    static func startActivity(content: String) {
        if ActivityAuthorizationInfo().areActivitiesEnabled {
            let attributes = MyActivityAttributes(name: content)
            let contentState = MyActivityAttributes.ContentState(value: 0)
            do {
                let _ = try Activity<MyActivityAttributes>.request(attributes: attributes, contentState: contentState)
            } catch {
                print("Error starting activity: \(error)")
            }
        }
    }

    static func updateActivity(activityID: String, value: Int) {
        Task {
            let updatedState = MyActivityAttributes.ContentState(value: value)
            for activity in Activity<MyActivityAttributes>.activities {
                if activity.id == activityID {
                    await activity.update(using: updatedState)
                }
            }
        }
    }
}
Define your attributes:
struct MyActivityAttributes: ActivityAttributes {
    public struct ContentState: Codable, Hashable {
        var value: Int
    }

    var name: String
}
3. Expose Swift Code to Kotlin
Use an Objective-C bridging header or expose Swift to Kotlin via a wrapper:
a. Mark Swift functions with @objc and use a class:
@objc class LiveActivityWrapper: NSObject {
    @objc static func startActivity(withContent content: String) {
        LiveActivityManager.startActivity(content: content)
    }
}
b. Add to Bridging Header (if needed):
#import "YourAppName-Swift.h"
4. Call iOS-specific Code from Kotlin
Use Platform-specific implementation in Kotlin:
commonMain
expect fun startLiveActivity(content: String)
iosMain
actual fun startLiveActivity(content: String) {
    LiveActivityWrapper.startActivity(withContent = content)
}
5. Use It in Shared Code
Now you can call startLiveActivity("MyContent") from shared code and have it trigger Live Activities on iOS.
Notes:
Ensure your iOS app has appropriate entitlements (com.apple.developer.activitykit) and runs on iOS 16.1+.
Live Activities only work on real devices, not on the iOS Simulator.
You might need to configure Info.plist for widget support if using Dynamic Island or Lock Screen widgets.
Let me know if you want a working example or GitHub template!
Determining the right number of executors for reading a Delta table is essential for optimizing performance and resource usage in distributed environments like Spark. This becomes especially relevant when analyzing large-scale user data from streaming platforms like Redbox. For example, if Redbox wants to analyze viewing habits or content preferences across millions of players, setting the right executor count can significantly speed up data processing and reduce costs. It’s all about finding that balance between parallelism and efficiency—too few executors slow things down, too many can overwhelm your cluster. Tuning this properly ensures Redbox delivers smooth performance insights and a better streaming experience for its users
I'm getting the same problem on the CI/CD pipeline, and it does not occur locally. Have you found the solution to this problem? If yes, please post the cause.
The formula below worked for me, along with setting the KeepTogether property to "False":
(Body Width + Left margin + Right margin) <= (Page width)
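For example, on an A4 portrait page (21 cm wide) with 2 cm left and right margins, the report body must be no wider than 17 cm (17 + 2 + 2 = 21).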
This is about sealed vs. unsealed array shapes.
Unsealed array shapes allow extra keys. Right now PHPStan’s array shapes mostly act as unsealed, but sometimes it’s inconsistent.
There’s an open issue about it: https://github.com/phpstan/phpstan/issues/8438 (one of the most upvoted issues so it will get solved sooner or later)
Before Angular 14, if we opened the dialog with 100% or 100vw, it would be fullscreen.
{
width: '100%',
height: '100%',
maxWidth: '100vw',
hasBackdrop: false,
panelClass: 'wizard-dialog',
}
Currently I am using Angular 17 with the same configuration, but the dialog contains padding even when we open it with 100%.
Add this CSS:
.cdk-overlay-container .mat-mdc-dialog-container{ padding: 0px !important;}
Some possible causes:
Auto-generation by Klikpajak: Some systems auto-generate or adjust fields like invoice numbers, sequence numbers, or internal identifiers on upload — even if the draft looks fine pre-upload.
Field Mapping Issue: If NSFP is mapped incorrectly during the upload or transformation step, it might shift the value.
Concurrency Issue: If multiple uploads happen close together, Klikpajak might assign numbers sequentially based on real-time system state rather than uploaded draft values.
It’s very likely that Klikpajak has system-side logic that modifies NSFP values upon final posting, regardless of the uploaded draft state. Checking their documentation or consulting their technical support should help clarify whether you should manage NSFP yourself or let their system handle it.
Question: Why am I getting the warning 'WARNING: no privileges were granted for "My-Database"'?
Even though you can connect to My-Database using the Entra ID administrator (DbAdmins), that account does not automatically have the required privileges to run GRANT statements in that database, because:
By default, Entra administrators only have privileges in the default postgres database.
They do not get database-level privileges in any user-created databases (like "My-Database") unless explicitly granted by the PostgreSQL admin.
Question: How do I grant permission to newly added Microsoft Entra user to my database?
Step 1. If your Entra admin group doesn't already have access to "My-Database", you need to connect using the original PostgreSQL admin and run:
GRANT ALL ON DATABASE "My-Database" TO "DbAdmins" WITH GRANT OPTION;
Step 2. Then connect as DbAdmins to My-Database and run:
SELECT * FROM pgaadauth_create_principal('[email protected]', false, false);
GRANT CONNECT ON DATABASE "My-Database" TO "[email protected]";
Question: Is there a role, which would allow it without running this grant command for every database?
No, there is no built-in role in Azure Database for PostgreSQL Flexible Server that automatically grants Microsoft Entra administrators access to all databases.
You have to manually run GRANT statements for each user-created database if you want them to have privileges. This behavior is by design, to maintain explicit access control and security boundaries between databases.
Kindly go through the attached Microsoft document for more reference: https://learn.microsoft.com/en-us/azure/postgresql/flexible-server/how-to-manage-azure-ad-users
Yeah, I am achieving this by uploading the woff file of the font to the blob, importing it with @font-face in my CSS, and using it, and it works.
You can try changing the Gradle JDK from jbr-21 to jbr-17 in Android Studio's settings (Build, Execution, Deployment -> Build Tools -> Gradle)
I gave up on Charts and moved to DGCharts 5.1.0 which no longer has the dependency on SwiftAlgorithms. The upgrade to DGCharts was pretty painless and only required resolution of a couple of compile errors to get the new api up and running.
You can create a markup extension to resolve ViewModel from a DI container.
Ioc.cs
public class Ioc : MarkupExtension {
    public static Func<Type, object> Resolver { get; set; }

    public Type Type { get; set; }

    public override object ProvideValue(IServiceProvider serviceProvider) => Resolver?.Invoke(Type)!;
}
App.cs:
public partial class App : Application {
    //...
    protected override void OnStartup(StartupEventArgs e) {
        // Configure the resolver so that Ioc knows how to create ViewModels
        Ioc.Resolver = (type) => type != null ? host.Services.GetRequiredService(type) : null!;
    }
}
UserControl.xaml:
<UserControl
xmlns:in="clr-namespace:YourNamespaceHere"
xmlns:vm="clr-namespace:YourNamespaceHere"
DataContext="{in:Ioc Type={x:Type vm:MyViewModel}}" >
With this technique, you can easily resolve any ViewModel by its type in any UserControl.
I actually managed to get it to work using (C#)
driver.ExecuteScript("$('#Category').val(10000).change();");
Thanks for the feedback anyhow.
It might also be because there is no linter for Lua installed:
brew install luacheck
None of these solutions work for me.
xcode-select --install fails due to a server error. Another comment on this thread mentions it's not available from the server. I am on macOS 14.6.1, so I downloaded Xcode 15.3 manually from developer.apple.com. It installed correctly, but I still get the same errors:
xcode-select: error: invalid developer directory '/Library/Developer/CommandLineTools'
Failed during: /usr/bin/sudo /usr/bin/xcode-select --switch /Library/Developer/CommandLineTools
I simply want to install Homebrew so I can install Wine. Any advice would be great. Thanks.
Edit: I also added the folder CommandLineTools to the developer folder (Library/Developer/CommandLineTools) and found no success when installing Homebrew.
mylist = ['clean the keyboard', 'meet tom', 'throw the trash']
for index, item in enumerate(mylist):
    row = f"{index + 1}. {item.title()}"  # Apply title to the item here
    print(row)
    print(len(row))
Check out wrkflw. You can basically validate and execute workflows locally with it. Also, there is no dependency on Docker just to validate a workflow!
I followed the above method to remove the dependency and got this error; the issue still persists.
What I did:
1. Removed the dependency manually
2. Removed the Firebase configuration from AppDelegate.swift
import Flutter
import UIKit
//import Firebase
@main
@objc class AppDelegate: FlutterAppDelegate {
    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        // FirebaseApp.configure()
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
        // Apptics.initialize(withVerbose: true)
    }
}
Now the issue is this:
flutter run
Launching lib/main.dart on iPhone 15 Pro in debug mode...
Running pod install... 3.6s
Running Xcode build...
Xcode build done. 13.5s
Failed to build iOS app
Package Loading (Xcode): Missing package product 'FirebaseAppCheck'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Package Loading (Xcode): Missing package product 'FirebaseCore'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Package Loading (Xcode): Missing package product 'FirebaseMessaging'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Package Loading (Xcode): Missing package product 'FirebaseAnalytics'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Package Loading (Xcode): Missing package product 'FirebaseAnalyticsWithoutAdIdSupport'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Package Loading (Xcode): Missing package product 'FirebaseInAppMessaging-Beta'
/Users/rapteemac/Projects-Raptee/apptics_push_notofication/ios/Runner.xcodeproj
Could not build the application for the simulator.
Error launching application on iPhone 15 Pro.
I am also facing the same problem.
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
After adding the Thymeleaf dependency, it is able to find views in the templates folder.
If you want to create only the solution file for your existing project in VS Code, then use the command:
dotnet new sln --name yourProjectName
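You can then add your existing project(s) to that solution with dotnet sln add (the project path below is just an example):
dotnet sln add ./YourProject/YourProject.csproj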
You'll want to visit https://dashboard.stripe.com/test/logs and see if there are any errors related to SetupIntent confirmation. Another thing that you'll want to confirm is that your Stripe account is created in a supported country, and your flutter project has included the financial-connection dependency in both Android and iOS.
Thanks to Boris's advice, I managed to find a bug in the tests. An async fixture with session scope was declared, which created another event_loop and led to the problem.
Here is the code with the error:
@pytest_asyncio.fixture(scope="session")
async def engine(app_database_url, migrations) -> AsyncEngine:
    """Creates a SQLAlchemy engine to interact with the database."""
    engine = create_async_engine(app_database_url)
    yield engine
    await engine.dispose()
To fix it, you just need to remove scope="session" from the decorator. Then it will have function scope like all the other fixtures and will not create its own separate event_loop.
I had the same issue; it was resolved by setting the StoreKit Configuration to None in the scheme. It seems the sandbox item was overridden by the local .storekit file.
Product -> Scheme -> Edit Scheme -> Run -> Options -> StoreKit Configuration.
I think:
You are using threads that are not managed by Flink.
The record never gets flushed to the downstream buffer because the thread is not managed by Flink.
The downstream operator appears "stuck," which is exactly what you described.
If you are using multiple projects in your solution, try building them separately, one by one (right-click on each project and click Build).
Simplest solution
echo array_slice(explode(".",$_SERVER['HTTP_HOST']),-2,1)[0];
This will show the domain name as "imgur". To show the domain as "imgur.com", take the last two parts and implode them instead: echo implode(".", array_slice(explode(".", $_SERVER['HTTP_HOST']), -2));
After playing around with bursting for a while, I realized that I can download the bursting report by clicking on Report Job History --> clicking on the bursting job name that ran successfully --> clicking on the output name under the Output & Delivery tag.
Then the output will be downloaded and I can check it out. And the path needs to start with C:/ for it to actually run and send the report.
What's needed for RocksDB is the rocksdbjni jar. To fix the issue, I did the below:
RUN /opt/infinispan/bin/cli.sh install org.rocksdb:rocksdbjni:9.0.1
RUN cp /opt/infinispan/server/lib/rocksdbjni-9.0.1.jar /opt/infinispan/lib/
Note: I did try the below with no help, so I had to manually copy the jars. Not an ideal solution, but it works as expected.
ENV SERVER_LIBS="org.rocksdb:rocksdbjni:9.0.1"
This is because get_ticks is not a direct method of pygame, but you can access this method with pygame.time by writing:
import pygame as p
start_ticks = p.time.get_ticks()
public function getDuration($full_video_path)
{
    $getID3 = new \getID3;
    $file = $getID3->analyze($full_video_path);
    $playtime_seconds = $file['playtime_seconds'];
    $duration = date('H:i:s.v', $playtime_seconds);

    return $duration;
}
This is a very slow process. When you upload a video, it will take more time than a regular upload.
This post may be very old, but can you tell me how you added the user_friends permission to the SDK? Currently the user_friends permission in the app has the status "Ready for Testing", but when calling the SDK, the access token still does not have the user_friends permission. I don't know if I missed any steps. Looking forward to your feedback.
You can use macros to securely erase memory, in particular pam_overwrite_string().
I am not seeing the "Recent update jobs" button.
CRA has been deprecated for a while. As @MenTalist mentioned, this likely caused an incompatible build while trying to install the latest react@19.
I would definitely recommend spending a bit of time reading the official documentation here: https://react.dev/learn/creating-a-react-app
And only use the tools listed!
This issue happens when the payload you are running json-eval on is malformed. Verify that it is a valid JSON payload.
There is std::from_chars:
https://en.cppreference.com/w/cpp/utility/from_chars
It would have been un-C++-like to call it from_string ;)
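A small usage sketch (C++17), showing the error handling via the returned std::from_chars_result:
#include <charconv>
#include <iostream>
#include <string_view>
#include <system_error>

int main() {
    std::string_view input = "42abc";
    int value = 0;
    // Parses the leading integer; ptr points at the first unparsed character.
    auto [ptr, ec] = std::from_chars(input.data(), input.data() + input.size(), value);
    if (ec == std::errc{}) {
        std::cout << "parsed " << value << ", remainder: " << ptr << '\n';
    } else {
        std::cout << "parse failed\n";
    }
}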
DynamoDB can split partitions down to the size of a single item, so while you might have a hot partition initially because the two items shared the same partition key and started in the same partition, the database would adapt quickly.
Make sure you change your version after each change you make, in Project Settings --> Player.
The default version would be 0; you can just change it to 0.1 and so on.
Using a third-party payment processor with Stripe Billing is a private preview feature. You can sign up for early access on this page.
I'm a geologist and I use Jensen, Pyke diagrams to understand what is going on with volcanoes. I'd like to make a fixed background/template with such a diagram:
Now, in nightly, you could do
if let Some(6) = a && let Some(7) = b {
    /* Do something */
}
by using the let_chains feature.
Thank you to everyone for the answers!
I was expecting to get a different output from the code, and I needed to write both functions makeCacheMatrix() and cacheSolve() to make it work. In addition, cacheSolve() cannot use atomic vectors so you need to use a function as an argument, which is very confusing:
> cacheSolve(makeCacheMatrix(yourmatrix))
I was able to make it work after sleeping, some more googling and GitHub.
The output I was getting from makeCacheMatrix() is just letting me know that the functions and values are being stored in the environment for their use later by cacheSolve().
As SamR commented, I think it is a very confusing exercise.
I had a similar situation, and I found the accepted answer in the link shared by chb to work like a charm:
How to parse MySQL built-code from within Python?
[Column(TypeName = "bigint")]
public TimeSpan? LimitTime { get; set; }
This happens because get_ticks is not a direct method of pygame, but you can access this method with pygame.time by writing:
import pygame as p
start_ticks = p.time.get_ticks()
I have some funny username ideas you need to look at.
You have to modify the existing tag.
' Step 1
' Note that the CSS property is important
So there were a couple of things wrong with the setup after digging a little bit more.
https://websiteURL.com and https://www.websiteURL.com are treated differently. Ensure you enable both.
CORS cookies are restricted very heavily in Safari. There are a couple of ways to bypass this, but I ended up just changing my configuration to not use cookies and to pass a bearer token in each API call, which has seemingly worked thus far. Another way around it I saw was to put your website and backend in the same domain to avoid the CORS issue altogether. Proxies were the last thing I saw that could be a solution.
*A warning: ChatGPT and other chat bots can be a good starting point and do a lot of meaningful tasks, but for some more nuanced issues like this, they can't problem-solve. I spent quite a few hours going back and forth with them to no avail. Relying on them too heavily can cause more harm than good for issues like this that are not so straightforward. Forums, YouTube videos, and the documentation of whatever services you are using are all sources that should also be leveraged. A big lesson for me going forward.
Follow this guideline and it should fix your issue https://docs.flutter.dev/release/breaking-changes/flutter-gradle-plugin-apply
All the examples I've seen don't clean the MS attributes.
OpenSSL 3.5.0 8 Apr 2025
Extract a clean private key (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -nocerts -nodes | openssl rsa -out private.key
Extract the public certificate (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -clcerts -nokeys | openssl x509 -out pub.crt
Extract public pem (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -clcerts -nokeys | openssl x509 -out pub.pem
Extract the CA chain (if present) (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -cacerts -nokeys | openssl x509 -out somechain.crt
Extract Public Key from clean private key
openssl rsa -in private.key -pubout -out public.key
If you ever need to password-protect the private key during export
openssl rsa -in private.key -aes256 -out private-secure.key
Check if the *.crt, *.pem, work with your *.key / visually match the output
openssl rsa -noout -modulus -in private.key
openssl x509 -noout -modulus -in pub.crt
openssl x509 -noout -modulus -in pub.pem
In my case the parameter passed null for an int value. Make sure that required parameters are passed correctly.
I suspect that the issue is due to the fact that the shap_values array has slight differences in its output format depending on the model used (e.g., XGBoost vs. RandomForestClassifier).
You can successfully generate SHAP analysis plots simply by adjusting the dimensions of the shap_values array.
Since I don't have your data, I generated a sample dataset as an example for your reference:
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
# Generate sample data
np.random.seed(42)
features = pd.DataFrame({
    "feature_1": np.random.randint(18, 70, size=100),
    "feature_2": np.random.randint(30000, 100000, size=100),
    "feature_3": np.random.randint(1, 4, size=100),
    "feature_4": np.random.randint(300, 850, size=100),
    "feature_5": np.random.randint(1000, 50000, size=100)
})
target = np.random.randint(0, 2, size=100)
features_names = features.columns.tolist()
# The following code is just like your example.
X_train, X_test, y_train, y_test = train_test_split(features, target, test_size=0.2, random_state=42)
rf_model = RandomForestClassifier(n_estimators=100, random_state=42)
rf_model.fit(X_train, y_train)
y_pred = rf_model.predict(X_test)
explainer = shap.TreeExplainer(rf_model)
shap_values = explainer.shap_values(X_test)
# Adjust the dimensions of the shap_values object.
shap.summary_plot(shap_values[:,:,0], X_test, feature_names=features_names)
shap.summary_plot(shap_values[:,:,0], X_test, feature_names=features_names, plot_type="bar")
With the above, you can successfully run the SHAP analysis by simply adjusting shap_values to shap_values[:,:,0].
As for what the third dimension of shap_values represents when using RandomForestClassifier, you can explore it further on your own.
You can do it via code editors. If you connect your code editor to your GitHub repo, you can commit your code immediately without typing `git add`; you will just need to write the name of your commit and that's all.
It's the - in line 19; it shows up as — when I paste it into a file, and that causes the issue.
Change line 19 to:
Write-Host "Skipping '$($_.Name)' already exists in $baseFolder"
Once I did that with your script, it ran fine.
Since Angular 19, you can clean unused imports easily with a new schematic:
$ ng generate @angular/core:cleanup-unused-imports
Extract a clean private key (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -nocerts -nodes | openssl rsa -out private.key
Extract the public certificate (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -clcerts -nokeys | openssl x509 -out pub.crt
Extract public pem (no MS Bag Attributes)
openssl pkcs12 -in your.pfx -clcerts -nokeys | openssl x509 -out pub.pem
Extract the CA chain (if present)
openssl pkcs12 -in your.pfx -cacerts -nokeys -out CA_chain.crt
Extract Public Key from clean private key
openssl rsa -in private.key -pubout -out public.key
If you ever need to password-protect the private key during export
openssl rsa -in private.key -aes256 -out private-secure.key
Check if the *.crt, *.pem, work with your *.key / visually match the output
openssl rsa -noout -modulus -in private.key
openssl x509 -noout -modulus -in pub.crt
openssl x509 -noout -modulus -in pub.pem
package main

import (
    "fmt"
    "time"
)

func main() {
    input := "2025-04-21T09:18:00Z" // ISO 8601 string

    t, err := time.Parse(time.RFC3339, input)
    if err != nil {
        panic(err)
    }

    fmt.Println("Parsed Time:", t)
    fmt.Println("Local Time:", t.Local())
}