Just in case anyone needs the solution: the reason was very silly. I had forgotten to add the App Check capability to my target and to change the entitlement key to "production".
Here's a one-liner for you:
df = pd.concat([df.pop("column_name"), df], axis=1)
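For illustration, a minimal runnable sketch (the frame and column name are made up):
import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": [3, 4], "column_name": [5, 6]})
# pop() removes the column and returns it as a Series; concat puts it first
df = pd.concat([df.pop("column_name"), df], axis=1)
print(df.columns.tolist())  # ['column_name', 'a', 'b']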
<activity
android:name="com.example.MainActivity"
android:theme="@style/Theme.AppCompat.NoActionBar" />
I eventually stumbled upon an answer in the configuration settings (as expressed in the sitecore/admin/showconfig.aspx display). It turned out a .config file from a different environment had gotten into my local environment and was overwriting/patching Publishing.PublishingInstance with an unusable value. Once this was identified and located I was able to remove the spurious .config file and restore my local environment to normal publishing behavior.
You can try just smashing your PC into chunks.
Just step away from the PC and try to focus on the monitor, then take a little "jog" and smash your fist into it. Then try to use your legs to pull off the system block, and more.
Try setting this to True so the bot can receive messages, because otherwise it cannot see the sent commands:
intents.messages = True  # allow the bot to receive message events
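For context, a minimal sketch of where this flag lives, assuming discord.py 2.x (the prefix is just an example):
import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.messages = True          # make sure message events are enabled
intents.message_content = True   # privileged intent needed to read command text in discord.py 2.x
bot = commands.Bot(command_prefix="!", intents=intents)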
No, but a proposal was submitted for one in late 2023, so it may come to the language eventually: https://www.open-std.org/jtc1/sc22/wg14/www/docs/n3195.htm
There is another AppleScript method to get around the security settings, and it returns two lists. The first is a list of all System Fonts (all styles) and the second is all the Font Families (which is exactly the same as returned by the method described previously). It uses atsutil in a "do shell script" command and pushes the result to the clipboard, which can then be pasted into any field. The AppleScript is only one line. The FileMaker Pro script looks like this:
Perform AppleScript [ "do shell script \"atsutil fonts -list | pbcopy\"" ]
Paste [ Select ; Fonts::FontInputText ]
The returned text looks like this, and you can do whatever you need to style text:
System Fonts:
AcademyEngravedLetPlain
Agenda-Black
AlBayan
AlBayan-Bold
AlNile
AlNile-Bold
AlTarikh
...
System Families:
Academy Engraved LET
Agenda
Al Bayan
Al Nile
Al Tarikh
...
cleanstring = response.text.replace("'","")
My problem was that response.text is immutable, so I needed to create a new variable and assign the cleaned string to it.
Then I used cleanstring in my call where response.text would have gone.
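For illustration, a tiny sketch of the immutability point (a plain string stands in for response.text):
text = "it's a test"
clean = text.replace("'", "")  # replace() returns a new string; the original is untouched
print(text)   # it's a test
print(clean)  # its a test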
I eventually found the solution:
const JimpLib = require('jimp');
console.log('JimpLib', Object.keys(JimpLib)); // lists the different lib features such as loadFont
const Jimp = JimpLib.Jimp; // the usual Jimp class representing an image
var img = await Jimp.read('/path/image.png');
For me this issue occurred after I cleaned my DerivedData folder. I thought it was supposed to simply rebuild the necessary files/folders.
I managed to fix it by changing the Derived Data location to a custom location. Steps:
In Xcode -> File -> Workspace Settings -> Derived Data: switch from Default Location to Custom Location. I created a folder named DerivedData in my Documents folder and chose that one.
[Shift + Insert] worked for me.
It can be configured per project; go here:
Go to Project Tab
Click the three dots
Open
Uncheck "Members"
Just wondering if anyone has made ESP-RTC work.
RTSP has high latency, and I think ESP-IDF can work for live video.
I'd check this using a regex like this:
"MYabcVaLUe".matches( "^[a-fG-Z]+$") // true
"JaCKsoN".matches( "^[a-fG-Z]+$") // false
I have the same problem. Every other SPF or DKIM tool, besides Postmaster Tools, shows everything as correct with no errors/issues. The logic is not present in my Postmaster Tools results, which seems to also be the case for Ching. Does anybody have a clue?
For instance, in https://mxtoolbox.com/ all checks for SPF, DKIM and DMARC show in green, meaning success.
I'd appreciate any clues from you guys, thanks.
The solution to your problem has already been described. Here it is: https://stackoverflow.com/a/73865129
Copy then "Raw Paste" does not reformat the code.
You could do something like this:
open class GoogleMapsMarker: GMSMarker, UIAccessibilityIdentification {
public var accessibilityIdentifier: String?
}
then:
let marker = GoogleMapsMarker()
marker.accessibilityElementsHidden = false
marker.accessibilityIdentifier = "Map pin"
A better approach is to add your device as a test device (look in Logcat as it scrolls by down there) and use your production IDs.
What version of Java is your IntelliJ project using? The error you received is almost certainly caused by an unsupported Java version. I would try Java 17.
Edit: class-file major version 68 is Java 24; if you're working with an older project this is almost certainly not supported. Try Java 17.
For larger databases, WHERE (NOT) EXISTS is usually faster, especially on indexed columns such as id.
SELECT id, name
FROM t1
WHERE NOT EXISTS (
    SELECT id
    FROM t2
    WHERE t1.id = t2.id
);
It's not official, but you can use:
https://rapidapi.com/boztek-technology-boztek-technology-default/api/youtube-search-download3
Utilizing VPI significantly slows down execution. Don't vendors like Synopsys provide functions to access the passed structures or arrays without utilizing VPI?
After looking more into https://pyinstaller.org/en/stable/operating-mode.html I am realizing that although we name the file .pyz, it is, in fact, not a .pyz but an executable for Ubuntu.
Just switch renderResults() to a component element (i.e. <RenderResults />); function components called directly inside JSX have a restricted lifecycle.
I am having the same issue. I also have a user ID but no PCC or EPR; attached is my output.
For me, the error was caused by git not getting installed in the devcontainer: I had made changes that accidentally switched the architecture of the Docker images, without ever making any changes to switch it back to how it was before.
So adding this property in .devcontainer/docker-compose.yml fixed it:
services:
  appName:
    platform: linux/amd64
Edited code and got it working. Thank you, @nico_haase!
$instituteEntities = [];
for ($i = 0; $i < count($instituteList); $i++) {
$institute = new Institute();
$institute->setName($instituteList[$i]);
$instituteEntities[] = $institute;
$manager->persist($institute);
}
for ($i = 0; $i < count($departmentList); $i++) {
$department = new Department();
$department->setName($departmentList[$i]);
$department->setInstitute($instituteEntities[rand(0, count($instituteEntities) - 1)]);
$manager->persist($department);
}
As @ZyntaaX said, just ensure there is no comma after each key inside your .env file where you store all the API keys. That should do the trick.
After David's explanation, here is the better version of the code:
local fs = 'D:\\proj\\qlua100\\app\\data\\source\\orders_daily.csv'
local fd = 'D:\\proj\\qlua100\\app\\data\\source\\orders_daily.xlsx'
local command = '""csv2xls.cmd" -fs '..fs..' -fd '..fd..' --zoom 90"'
os.execute(command)
I replaced the double square brackets with single quotes and added escapes for the slashes. Now it is possible to use variables. The doubled double quotes are a peculiarity of the Windows CMD interpreter and must be used in this way; as far as I understand, they separate the command from its parameters. This version looks much better to my eyes. Thank you, David, for your help.
Private Sub TextBox2_GotFocus()
    Debug.Print "TextBox2_GotFocus started"
    TextBox2.BringToFront ' Run-Time error 1004: BringToFront method of OLEObject class failed
    TextBox2.Activate
    Debug.Print "TextBox2_GotFocus completed"
End Sub
Yes, you must put the dependency coordinates within parentheses and double quotation marks. Example given below:
implementation("de.hdodenhof:circleimageview:3.1.0")
Log Analytics supports the creation of linked BigQuery datasets which let BigQuery have read access to the underlying data.
The correct and supported method to programmatically query the data within the my_project.global._Default._Default dataset is through the BigQuery API.
There are duplicate log entries in your Log Analytics results because Log Analytics doesn't perform the same type of deduplication that the Logs Explorer performs. To resolve duplicate log entries, you may try the list of items provided in the document.
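For illustration, a minimal sketch of querying the linked dataset with the BigQuery Python client (the project ID and column names are assumptions; adjust to your linked dataset's schema):
from google.cloud import bigquery

client = bigquery.Client(project="my_project")  # assumed project ID
# The linked dataset can be queried like any other BigQuery table/view
sql = """
SELECT timestamp, severity, log_name  -- column names are assumptions
FROM `my_project.global._Default._Default`
LIMIT 10
"""
for row in client.query(sql):
    print(row)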
I found a good workaround for this issue. I'm using the wait_for module. This module also works with the free strategy. Just set a timeout (in seconds) and a when condition.
- name: wait for something
  ansible.builtin.wait_for:
    timeout: 20
  when:
    - some_condition
https://docs.ansible.com/ansible/latest/collections/ansible/builtin/wait_for_module.html
@Mian Saeed Akbar This is very useful, thanks. Will it always (in any graph) give the optimal/minimal weight of any path?
I just Cmd+Q'd Xcode and reopened it, and it seems to build now. For me this happened after a merge, and I had run: cd ios && pod deintegrate && cd .. && rm -rf node_modules && yarn install && cd ios && pod install && cd ..
- Maybe worth checking Node versions as well. I was installing the packages with the wrong Node version. NVM is your friend in this case.
Newtonsoft Json.Linq methods generally return nullable types explicitly. This can be confusing to some users if they are using "var". When a method returns nullable objects, one should always check whether an actual value was returned by using the "HasValue" or "HasValues" property.
Note that when using a nullable-type object, you check for a value with "if (type.HasValue)" instead of "if (type == null)". The first way is much cleaner.
I’m experiencing a similar issue. Whenever I right-click or perform a copy action in any text field, I get memory leak errors. They’re usually related to NSMenu. They’re very minor, but quite annoying.
Please, what was the answer? Thank you.
The answer is that you have to remove the read:guardian-factors scope from the checked scopes.
This applies in case you checked all the scopes for your API (like I did), and like this documentation suggests.
Oh, my dear. I'm du%& as a post.
The token \s also matches \r and \n, and that's why it is stepping over the line break.
Just doing:
const VALUE_PART = {
  scope: "addition",
  begin: /[:=]/,
  end: /$/, // match until the end of the line
  excludeBegin: true, // ensure the separator is not included
  excludeEnd: true,
}
does the trick. Fiddle is updated.
You don't have to use Pinia. With Inertia this is not a good idea, because Inertia already does state management, so you need to share it among pages globally.
Finally I figured out the root cause:
Some parts of my JSP page are dynamically loaded on the client side within an iframe, which cuts my single JSP page into multiple JS contexts. The trigger and its related component fell into different JS contexts.
The fix was simple: I just put them in the same context.
I want to implement the same thing for my application. Could you show a code snippet of how you implemented it with save_object() and renderImage()? It would be very useful.
Thanks!
Check Your Source File Location
The location of the source file (e.g., main.cpp) that includes #include "renderer.h" matters. When you use quotation marks in an include statement like #include "renderer.h", the compiler looks for the header file:
In the same directory as the source file, by default.
In any additional include directories specified in the project settings.
If your source file is in the same directory as "renderer.h" (i.e., the project folder), the include statement should work without further changes. However, if your source file is in a subdirectory (e.g., a "src" folder), the compiler won't find "renderer.h" in that subdirectory unless you adjust the path. For example: if "renderer.h" is in the project root and your source file is in a "src" subdirectory, use #include "../renderer.h" to go up one directory.
You can go to Solution Explorer and locate your source file and "renderer.h". Right-click each file, select "Open Containing Folder", and compare their locations.
I recommend you watch this video to understand include errors in C++.
Thank you very much for the replies; I certainly was in DLL Hell. As suggested, I switched my compiler to MSYS2, but I was still getting a very similar DLL error when trying to compile my code from cmd:
The procedure entry point crc32_combine could not be located in the dynamic link library C:\msys64\ucrt54\bin\..\lib\gcc\x86_64-w64-mingw32\14.2.0\cc1plus.exe
I ended up spending a while going back and forth with ChatGPT to troubleshoot the error and was eventually recommended to try a where zlib1.dll command in cmd. This pointed me to a folder called GtkSharp which, after deleting it, resolved any issues I had compiling or running my code from cmd. Unfortunately I am not positive where this folder came from in the first place; my best guess would be that it was something left over from a previous compiler installation.
At this point VS Code was now giving me an error:
(preLaunchTask 'C/C++: g++.exe build active file' terminated with exit code -1)
but deleting the .vscode folder of my project let the code compile.
I can't say for certain that what I did would resolve my original error, as I'm quite happy leaving well enough alone at this point. I am also uncertain where ChatGPT got zlib1.dll from.
DynamoDB Streams now supports PrivateLink:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/privatelink-streams.html
Include a primary DNS, e.g. Google's 8.8.8.8:
<ExtraParameters>+CGDCONT=1,"IP","UMTS";+CDNSCFG="8.8.8.8"</ExtraParameters>
Sure. This component comes from a library of mine.
TS:
import { Component, Input, ViewEncapsulation } from '@angular/core';
@Component({
selector: 'sued-text-link',
template: 'Label: {{ label }}',
styleUrls: ['./sued-text-link.component.scss'],
encapsulation: ViewEncapsulation.None,
})
export class SuedTextLinkComponent {
@Input({ required: true }) label: string;
}
I'm just trying to create a component in an Angular library and display its content. I actually need this label prop to be an object {text: string, size: number}, but I simplified it for this question.
I cannot understand what I'm doing wrong.
Here is the app.module:
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';
import { SuedComponentsModule } from 'node_modules/sued-components';
import { CommonModule } from '@angular/common';
import { FormsModule } from '@angular/forms';
@NgModule({
declarations: [
AppComponent
],
imports: [
BrowserModule,
CommonModule,
FormsModule,
SuedComponentsModule,
],
exports: [
SuedComponentsModule
],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
SuedComponentsModule:
import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
import { SuedTextLinkComponent } from './components/sued-text-link/sued-text-link.component';
@NgModule({
declarations: [
SuedTextLinkComponent
],
exports: [
SuedTextLinkComponent
],
imports: [
CommonModule,
]
})
export class SuedComponentsModule { }
PS: I'm using npm link.
As has been said in the comments by @Slaw and @Anonymous:
The algorithm that your teacher is after is:
while (remainingElements > 0) {
    int i = random.nextInt(board.length); // pick a random index
    if (board[i] != 0) {
        System.out.println(board[i]);
        board[i] = 0;
        remainingElements--;
    }
}
For picking a random index into your board you need to use random.nextInt(board.length).
Notice that we only decrease remainingElements when a number is picked and printed, not every time through the loop. This causes the loop to repeat more times than there are elements in the board array. Eventually the random index will also pick the last remaining elements, remainingElements will be decreased to 0, and the loop will terminate.
The solution was to use a Nested Scroll View, instead of a regular scroll view.
Slowly but surely, this post by Chrome for developers will take precedence over any answer given here: The select element can now be customized with CSS
Did you solve this issue? I'm facing the same question right now.
An alternative could be Power Query, which is available in legacy Excel versions such as Excel 2013.
let
Source = Excel.CurrentWorkbook(){[Name="Tabelle2"]}[Content],
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Level", Int64.Type}, {"Item Number", type text}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "User Column", each if [Level] = 0 then [Item Number] else null),
#"Filled Down" = Table.FillDown(#"Added Custom",{"User Column"}),
#"Removed Duplicates" = Table.Distinct(#"Filled Down", {"Item Number", "User Column"}),
#"Removed Columns" = Table.RemoveColumns(#"Removed Duplicates",{"User Column"})
in
#"Removed Columns"
I used tguen's answer to solve the same problem. I had to add the constructor
Editor() : QTextEdit() {}
to editor.hpp in the public section to be able to call the editor using
editor = new Editor;
The solution here was:
I created the folder "arquivo" in resources and pasted logo.png there.
InputStream input = this.getClass().getClassLoader().getResourceAsStream("arquivo/logo.png");
File fileout = new File("logo.png");
FileUtils.copyInputStreamToFile(input, fileout);
.addInlineAttachment("logo.png", fileout, "image/png", "<[email protected]>");
Can't bind to 'label' since it isn't a known property of 'sued-text-link'.
Well, does your "sued-text-link" have an input called label? Also, please show your component's code.
After the Chrome Browser lost my account for the third time in the last three years, I came to a simple conclusion: never to use Chrome again. That's all. You must respect yourself. Android, Google and its products are evil for humanity. Take care of yourself.
CodeSkool uses WebSockets to connect to the board. As the website is served over SSL but the board does not have any SSL, you need to allow mixed content.
https://forum.codeskool.cc/t/websocket-issue/55/2
Scratch uses Scratch Link, a separate executable, to do that, but that's the wrong choice according to the CodeSkool team, so they just ask you to allow mixed content.
CodeSkool also has its own forum, where you can ask any questions.
I would get 'Server refused key' all the time and could not work out why. I could connect when using the Windows SSH from the command line, so I knew everything like the IP address was correct. But then, BAM, I saw this post, updated to 8.3, and it worked... Thanks!
Which dependencies did you have to download manually?
I have a similar problem: I want to know which module imports which library/module, so I wrote a short script:
#!/usr/bin/env python3
"""show_imports.py: Show imports from files"""
import argparse
import ast
from collections import defaultdict
from pathlib import Path

def find_imports(path: Path, found: dict):
    content = path.read_text()
    module = ast.parse(content, path)
    for entity in ast.walk(module):
        if isinstance(entity, ast.Import):
            for alias in entity.names:
                found[alias.name].add((path, entity.lineno))
        elif isinstance(entity, ast.ImportFrom):
            found[entity.module].add((path, entity.lineno))

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("dir")
    options = parser.parse_args()
    root = Path(options.dir).resolve().relative_to(Path.cwd())
    if root.is_file():
        paths = [root]
    elif root.is_dir():
        paths = root.rglob("*.py")
    else:
        raise SystemExit(f"{root} is not a valid dir or file")
    found = defaultdict(set)
    for path in paths:
        find_imports(path, found)
    for mod, paths in found.items():
        print(mod)
        for path, lineno in sorted(paths):
            print(f" {path}({lineno})")

if __name__ == "__main__":
    main()
You can run this script and pass in either a single python script, or a directory. In the case of directory, all python scripts in there will be analyzed. Here is a sample run of the script against itself:
$ ./show_imports.py show_imports.py
argparse
 show_imports.py(4)
ast
 show_imports.py(5)
collections
 show_imports.py(6)
pathlib
 show_imports.py(7)
I use the ast library to parse each Python script. ast.walk() walks through all entities in the script, but I only care about the ast.Import and ast.ImportFrom entities.
As christoph-rackwitz suggested in his link, I want an inscribed rectangle. I eventually gave some good instructions to ChatGPT and it came up with an answer. I've included it in the app and tested it on several image pairs, and it crops them effectively. Here is the relevant code:
def find_corners(image):
    # Finds the four extreme corners of the valid image region (non-black pixels).
    coords = np.column_stack(np.where(np.any(image > 0, axis=2)))
    top_left = coords[np.argmin(np.sum(coords, axis=1))]
    bot_left = coords[np.argmax(coords[:, 0] - coords[:, 1])]
    bot_right = coords[np.argmax(np.sum(coords, axis=1))]
    top_right = coords[np.argmax(coords[:, 1] - coords[:, 0])]
    return top_left, bot_left, bot_right, top_right

def get_overlap_region(imageL, imageR):
    # Compute the largest overlapping area after alignment.
    left_corners = find_corners(imageL)
    right_corners = find_corners(imageR)
    left_TL, left_BL, left_BR, left_TR = left_corners
    right_TL, right_BL, right_BR, right_TR = right_corners
    top_limit = max(left_TL[0], left_TR[0], right_TL[0], right_TR[0])
    bot_limit = min(left_BL[0], left_BR[0], right_BL[0], right_BR[0])
    left_limit = max(left_TL[1], left_BL[1], right_TL[1], right_BL[1])
    right_limit = min(left_TR[1], left_BR[1], right_TR[1], right_BR[1])
    return imageL[top_limit:bot_limit, left_limit:right_limit], imageR[top_limit:bot_limit, left_limit:right_limit]
-----
imageLaligned, imageRaligned = find_alignment(imageL, imageR)
imageLcropped, imageRcropped = get_overlap_region(imageLaligned, imageRaligned)
cropH = min(imageLcropped.shape[0], imageRcropped.shape[0])
cropW = min(imageLcropped.shape[1], imageRcropped.shape[1])
imageLcropped = imageLcropped[:cropH, :cropW]
imageRcropped = imageRcropped[:cropH, :cropW]
@LMC by cropping, I meant cutting out a fragment of an image and saving it as another file. So, for example,
img[500:1500, 500:1500]
would give an image from 500th to 1500th pixel "vertically" and from 500th to 1500th pixel "horizontally".
@Konstantin Makarov there are two issues with your code (neither version works for me):
@etauger I got numpy. Your code doesn't work for me. It gives an error on the last line of code (saving the file). The error is below:
"C:\Program Files\Python\Python313\python.exe" D:\praca\GUMED\serce\testy2.py
Traceback (most recent call last):
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\tag.py", line 29, in tag_in_exception
yield
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 826, in write_dataset
write_data_element(fp, get_item(tag), dataset_encoding)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 686, in write_data_element
raise ValueError(
...<3 lines>...
)
ValueError: The (7FE0,0010) 'Pixel Data' element value hasn't been encapsulated as required for a compressed transfer syntax - see pydicom.encaps.encapsulate() for more information
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\praca\GUMED\serce\testy2.py", line 10, in <module>
ds.save_as(r"D:\praca\GUMED\dicom\mrxs\1_AORTA\AO_1_014_Masson\3_0-test-cropping.dcm")
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\dataset.py", line 2642, in save_as
pydicom.dcmwrite(
~~~~~~~~~~~~~~~~^
filename,
^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
)
^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 1455, in dcmwrite
write_dataset(fp, dataset)
~~~~~~~~~~~~~^^^^^^^^^^^^^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 825, in write_dataset
with tag_in_exception(tag):
~~~~~~~~~~~~~~~~^^^^^
File "C:\Program Files\Python\Python313\Lib\contextlib.py", line 162, in __exit__
self.gen.throw(value)
~~~~~~~~~~~~~~^^^^^^^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\tag.py", line 33, in tag_in_exception
raise type(exc)(msg) from exc
ValueError: With tag (7FE0,0010) got exception: The (7FE0,0010) 'Pixel Data' element value hasn't been encapsulated as required for a compressed transfer syntax - see pydicom.encaps.encapsulate() for more information
Traceback (most recent call last):
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\tag.py", line 29, in tag_in_exception
yield
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 826, in write_dataset
write_data_element(fp, get_item(tag), dataset_encoding)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\marci\AppData\Roaming\Python\Python313\site-packages\pydicom\filewriter.py", line 686, in write_data_element
raise ValueError(
...<3 lines>...
)
ValueError: The (7FE0,0010) 'Pixel Data' element value hasn't been encapsulated as required for a compressed transfer syntax - see pydicom.encaps.encapsulate() for more information
Process finished with exit code 1
In addition to InSync's solution, you can also do this:
class A:
    B: 'A'

A.B = A()
This is an example of a forward reference, and mypy handles those by putting the type name in quotes.
I'm still facing the same error. I've followed all the suggested answers, to no avail. I've added the PHP version to my linter.yml file, which matches my composer version, and I'm still getting the same error. Any help is appreciated!
2025-03-26 18:40:10 [FATAL] Failed to run composer install for /github/workspace. Output: Your lock file does not contain a compatible set of packages. Please run composer update.
You can give them indices and put them in a list or array. If you cross the first object, increment the index counter and get the next object.
And if all else fails, try a different cable. I ended up using my Gopro cable instead, and it worked right away.
It should be
encoded_key = file("${path.module}/public_key.pem")
not
encoded_key = filebase64("${path.module}/public_key.pem")
file() returns the raw file contents, while filebase64() base64-encodes them first.
You might encounter the 1603 error. You can fix it by running CMD as an administrator and then using the command winget install -e --id Memurai.MemuraiDeveloper. Alternatively, navigate to the directory containing the MSI file and run msiexec /i <filename>.msi.
OJDBC Extensions (https://github.com/oracle/ojdbc-extensions/tree/main) help you set your credentials in Azure Key Vault with no code change (though you need to add the jar dependencies): https://blogs.oracle.com/developers/post/jdbc-config-providers
You can make another project and uncheck the box at the bottom labeled "module-info.java"; for me, it works fine without one.
Thanks mate, worked for me also! :-)
Based on @fenton's solution, but it remains possible to do Object.keys(Color):
export enum Color {
Green = "GREEN",
Red = "RED",
}
export namespace Color {
export declare const values: readonly Color[];
export declare function stringify(mode: Color): string;
}
Object.setPrototypeOf(Color, {
values: Object.values(Color),
stringify: (color: Color): string => {
switch (color) {
case Color.Green:
return "Зеленый";
case Color.Red:
return "Красный";
}
},
});
I am facing another type of problem.
Thank you very much for your feedback and help. I now know that this website is not for beginner programmers. My brain is struggling to wrap around the coding concepts. Thankfully, I have found some beginner Python forums and I will direct my questions there. Secondly, yes, I agree that zyBooks is awful. We are literally unable to delete or change existing code, as the program will not allow you to. It can be frustrating when asking for help or using tutoring, as they often point out the errors.
Lastly, with your help I found the problem. Here is the corrected code, with 5/5 tests passed.
string = input()
valid = True
i = 0
while valid and i < len(string):
    # To be valid, the string can only contain digits
    # and a sign character (+ or -) as the first character.
    if i == 0:
        if not (string[i].isdigit() or string[i] in ['+', '-']):
            valid = False
    else:
        if not string[i].isdigit():
            valid = False
    i += 1
if valid:
    print("valid integer")
else:
    print("invalid integer")
Ran into this as well; I believe this is a unique intersection of PHP 8.0 with versions of Xdebug >= 3.3.0. Downgrading to 3.2.x worked; however, PHP 8.0 is of course end of life. Running a supported version of PHP along with a current, supported version of Xdebug is likely to resolve this more definitively.
I read elsewhere that this is potentially related to https://bugs.xdebug.org/view.php?id=2222, so this might still be an issue, but for sure this is a version-mismatch issue between Xdebug and the language features used in the entity proxy.
If you want to host your own Reflex web app on a Digital Ocean droplet, you can follow the instructions:
You can use Grant-AzDiskAccess and look for the AccessSAS property to download the disk without copying it to a storage account.
Can you show the routes using the php bin/console debug:router command?
On top of jareon's answer, you could also go to Clients > {your-client}. Under Authentication flow you should disable Direct access grants. This will disable the password grant type, which should not be used and will be removed in the OAuth 2.1 specification.
Yes, Reflex is fully built on WebSocket communication from the Next.js frontend to the FastAPI backend.
It's explained more in their docs: https://reflex.dev/docs/advanced-onboarding/how-reflex-works/
The solution mentioned in this Postman community forum has worked for me.
Under the Advanced section of Token Configuration in Postman, go to the Token Request section, add a new Key called Origin, and use the value https://oauth.pstmn.io/v1/callback, assuming you have selected Authorize using Browser.
Please follow the procedure stated in this documentation:
https://neo4j.com/docs/operations-manual/2025.02/configuration/connectors/
or this SO answer can also help you:
I fixed the issue by using CMD instead of Powershell.
Perhaps not a complete answer, but I think you need to re-examine your model logic a bit.
Two things to suggest:
In your result, you are concluding that vehicles disappear at Z1 and then new vehicles are produced there. How do you know they are different? (It is a rhetorical question... ;) ) You don't. With the model construct you have, it is impossible to determine whether a vehicle passes through a Z node or is replaced by a new vehicle. So, for all of the Z nodes, you need to add an artificial/synthetic adjacent node that is a source/sink to handle that. So this:
A --- Z1 --- N1 --- N2 --- B
needs to be augmented to this:
A --- Z1 --- N1 --- N2 --- B
      |
      S1
And then you do normal conservation of flow at Z and track in/out at S
Before you do that, however, you should re-examine the logic of your model. Right now, because you are minimizing overall flow, you are at high risk of just making cars "go away" at Z nodes and having them appear at other Z nodes, because that gives a lower objective value. What do you think you'd get with this model, with some sourcing at A and demand at B:
A --- Z1 --- N1 --- N2 --- N3 --- N4 --- Z2 --- B
I think you'd have ZERO flow at all of the N nodes. Think about it / mock it up with your data. You probably want to weight flows from Z to your new S nodes smartly, such that they are modestly more expensive than any other shortest-path connection (see the sketch below).
Lastly, make a smaller model to test with. It is much easier to troubleshoot. After you are confident that it is working, then step up to the larger dataset.
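For what it's worth, a minimal sketch of the synthetic source/sink idea (assuming networkx; all node names, costs, and demands are made up):
import networkx as nx

G = nx.DiGraph()
G.add_node("A", demand=-5)  # negative demand = supply: 5 vehicles enter at A
G.add_node("B", demand=5)   # 5 vehicles must reach B
for u, v in [("A", "Z1"), ("Z1", "N1"), ("N1", "N2"), ("N2", "B")]:
    G.add_edge(u, v, weight=1, capacity=10)
# synthetic source/sink S1 adjacent to Z1, priced modestly above any real path
G.add_edge("Z1", "S1", weight=100, capacity=10)
G.add_edge("S1", "Z1", weight=100, capacity=10)
flow = nx.min_cost_flow(G)
print(flow["Z1"])  # in/out at Z1, including any traffic absorbed/emitted by S1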
In the Rive app we must explicitly emit events in order to see them at runtime. There are actually elements (next to joysticks) called 'events' that trigger based on conditions in the State Machine logic.
As an alternative, I came across this API: https://reccobeats.com/docs/apis/extract-audio-features. It doesn't provide audio features directly but extracts them from an audio sample.
On the Solution Explorer, right click on the folder name and select CMake Workspace Settings. Change enable CMake to true and save, and CMake should work again.
While authentication with a JSON Web Token is still not available (the TCP transport is not recommended now and HTTP is preferred), you can ingest with Basic Auth using Telegraf: Ingest data into QuestDB via Telegraf with authentication.
Multiple ways exist.
Fast, easy and expensive:
Open the virtual machine in the Azure Portal
Navigate to Monitoring/Insights and enable it
Wait for everything to be set up and the first data to be propagated (5 minutes should be good). Refresh the page and you should see it.
Fast, somewhat complicated, way less expensive:
Do the steps from "Fast, easy and expensive", but when you do it, make sure to note the "Data collection rule"
Navigate to the previously noted Data collection rule
Below Configuration/Data sources select Performance Counters
Select Custom and enter your counters. Replace X here with the disk letters you need. If a letter does not exist on a VM, it's ignored. Don't forget to click Add.
Useful counters for Disk utilization are (X again placeholder):
\LogicalDisk(X:)\Free Megabytes
\LogicalDisk(X:)\% Free Space
Click on Add destination and add a new Destination, remove the existing one
Click on save
Navigate back to the virtual machine where you started, open Monitoring/Metrics, and adjust the Metric Namespace. There you can find your metrics:
P.S.: To get a list of all counters available on your machine, run typeperf -q in PowerShell; run typeperf -q -? to get help on the command.
You could use the .toARGB32() method on Color.
Well, when you're using FormData to submit your receipt with images, request.body gets processed differently than with regular JSON payloads, causing your permission guard to fail to access the employee credentials properly.
The main problem is that when using FileFieldsInterceptor or any file-upload interceptors, the form-data fields are parsed differently. Your guard is trying to destructure employeeCode and employeePassword directly from request.body, but with multipart/form-data these might be coming in as strings rather than as part of a JSON object.
import {
CanActivate,
ExecutionContext,
ForbiddenException,
Injectable,
UnauthorizedException,
} from '@nestjs/common'
import { PrismaService } from '../prisma/prisma.service'
import { PermissionEnum } from '@prisma/client'
import { Reflector } from '@nestjs/core'
import { compare } from 'bcryptjs'
@Injectable()
export class PermissionGuard implements CanActivate {
constructor(
private prisma: PrismaService,
private reflector: Reflector,
) {}
async canActivate(context: ExecutionContext): Promise<boolean> {
const requiredPermission = this.reflector.get<PermissionEnum>(
'permission',
context.getHandler(),
)
if (!requiredPermission) {
return true
}
const request = context.switchToHttp().getRequest()
const tokenId = request.user?.sub
const isCompany = request.user?.pharmacy
// Handle both JSON and FormData formats
let employeeCode, employeePassword
if (request.body) {
// Handle FormData - values will be strings
employeeCode = request.body.employeeCode || request.body.employee_code
employeePassword = request.body.employeePassword || request.body.employee_password
}
if (!tokenId) {
throw new UnauthorizedException('User not authenticated')
}
let permissions: PermissionEnum[] = []
if (isCompany) {
// If company login, we need employee validation
if (!employeeCode || !employeePassword) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'Employee credentials required',
})
}
const company = await this.prisma.company.findFirst({
where: { id: tokenId },
include: {
employees: true,
},
})
if (!company) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'Farmácia não encontrada',
})
}
const employee = company.employees.find(
(employee) => employee.code === employeeCode,
)
if (!employee) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'Funcionário não encontrado',
})
}
const isPasswordValid = await compare(employeePassword, employee.password)
if (!isPasswordValid) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'Credenciais incorretas',
})
}
permissions = employee.permissions
} else {
const user = await this.prisma.user.findFirst({
where: {
id: tokenId,
},
})
if (!user) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'User not found',
})
}
const pharmacy = user?.pharmacies[0]?.pharmacy
if (!pharmacy) {
throw new UnauthorizedException({
statusText: 'unauthorized',
message: 'Company not encontrada',
})
}
permissions = user.pharmaceutical.permissions
}
const hasPermission = permissions.some(
(perm) => perm === requiredPermission,
)
if (!hasPermission) {
throw new ForbiddenException(`Does not have the required permission: ${requiredPermission}`)
}
return true
}
}
Key changes I made to fix your issue:
More flexible field parsing: the updated guard now checks for different possible field names (employeeCode/employee_code), since form fields are sometimes sent with underscores.
Null checking: Added validation to ensure the employee credentials are present when company login is detected.
Better error handling: More descriptive error messages to help debug authentication issues.
Safe property access: Added optional chaining in the user pharmacy access to avoid potential undefined errors.
If you're still having issues, you could also consider implementing a custom middleware specifically for handling employee authentication in FormData requests, which would run before your guard and populate request.body with the parsed credentials.
Whereas I need the output shown below. Any suggestions on the fastest method, without using a loop?
in_column out_column
0 5 1
1 5 2
2 5 3
3 8 1
4 13 1
5 13 2
6 13 3
7 13 4
8 13 5
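For what it's worth, a minimal pandas sketch that reproduces this output (assuming equal values in in_column are always adjacent, so a plain groupby cumcount works):
import pandas as pd

df = pd.DataFrame({"in_column": [5, 5, 5, 8, 13, 13, 13, 13, 13]})
df["out_column"] = df.groupby("in_column").cumcount() + 1
print(df)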
I managed to make it work by using the 64-bit version of MSBuild, as this addresses the x64 tools:
"C:\Program Files\Microsoft Visual Studio\2022\BuildTools\MSBuild\Current\Bin\amd64\MSBuild.exe" "C:\Path\To\YourSolution.sln" /p:Configuration=Release /p:Platform=x64
You are right, React Admin does bring MUI in its own dependencies.
However, I believe package managers that use PnP (namely PNPM) are more strict regarding dependencies: if you import from, say, @mui/material directly in your own code, then you need to add an explicit dependency on @mui/material. The transitive dependency through react-admin is no longer sufficient.
Also, React Admin v5.6.4 included a fix to improve compatibility with some package managers, namely PNPM.
Does using this version help fix your issue?
Thanks to Friede, I've made a function to calculate the distance of many points along a line. Posting here in case it is useful for future reference: https://gist.github.com/wpetry/bb85a1ec3c408b2ab5dae17bd1e7771c
v_distance_along <- function(points, line, dist_unit = "km") {
# packages
require(sf)
require(sfnetworks)
require(dplyr)
require(units)
# check inputs
if (!inherits(points, "sf") && !inherits(points, "sfc")) {
stop("'points' must be an sf or sfc object containing POINT or MULTIPOINT geometries.")
}
if (!inherits(line, "sf") && !inherits(line, "sfc")) {
stop("'line' must be an sf or sfc object containing a LINESTRING or MULTILINESTRING geometry.")
}
if (!all(st_geometry_type(points) %in% c("POINT", "MULTIPOINT"))) {
stop("The second argument must be POINT or MULTIPOINT geometries.")
}
if (!st_geometry_type(line) %in% c("LINESTRING", "MULTILINESTRING")) {
stop("The first argument must be a LINESTRING or MULTILINESTRING.")
}
if (is.na(sf::st_crs(points)) | is.na(sf::st_crs(line))) {
stop("Both 'points' and 'line' must have a defined coordinate reference system (see ?st_crs).")
}
if (sf::st_is_longlat(points) | sf::st_is_longlat(line)) {
stop("")
}
if (sf::st_crs(points) != sf::st_crs(line)) {
stop("'points' and 'line' must have the same coordinate reference system (see ?st_crs).")
}
if (!units::ud_are_convertible(dist_unit, "meter")) {
stop("'dist_unit' must be a valid unit of length.")
}
line <- sf::st_cast(line, "LINESTRING") # ensure single LINESTRING
path <- sfnetworks::as_sfnetwork(sf::st_as_sf(sf::st_sfc(line$geometry)),
directed = FALSE)
near <- sf::st_nearest_points(points, line)
snap <- suppressWarnings(sf::st_sfc(lapply(near, function(l) sf::st_cast(l, "POINT")),
crs = sf::st_crs(line)))
pathx <- sfnetworks::st_network_blend(path, snap) # add snapped points to network
dist <- pathx |>
sfnetworks::activate("edges") |>
sf::st_as_sf() |>
dplyr::mutate(length = sf::st_length(x)) |>
sf::st_drop_geometry() |>
dplyr::mutate(dist = round(cumsum(units::set_units(length - length[1], dist_unit,
mode = "standard")), 1),
from = dplyr::case_when( # re-order vertices, moving line end to last position
from == 1L ~ 1L,
from == 2L ~ max(from),
from >= 3L ~ from - 1L,
)) |>
dplyr::select(from, dist)
return(dist)
}
I am able to get the AEST timezone datetime with the approach below:
let queryDate = result.values[1];
let queryDateObj = new Date(queryDate);
log.audit("queryDateObj", queryDateObj);
let ASTTimeZone = format.format({
    value: queryDateObj,
    type: format.Type.DATETIME,
    timezone: format.Timezone.AUSTRALIA_SYDNEY
});