In my case, I had a similar error, but after increasing the amount of memory in Racket from 128M to 500M and installing the ".sty" file it was complaining about, it produced the PDF output:
! LaTeX Error: File `mathabx.sty' not found.
Type X to quit or to proceed,
or enter new name. (Default extension: sty)
Enter file name:
! Emergency stop. <read *>
l.52 \packageWasysym ^^M
*** (cannot \read from terminal in nonstop modes)
Here is how much of TeX's memory you used:
8771 strings out of 475246
130799 string characters out of 5768754
516143 words of memory out of 5000000
31581 multiletter control sequences out of 15000+600000
558832 words of font info for 37 fonts, out of 8000000 for 9000
59 hyphenation exceptions out of 8191
75i,0n,79p,244b,38s stack positions out of 10000i,1000n,20000p,200000b,200000s
! ==> Fatal error occurred, no output PDF file produced!
/usr/share/racket/pkgs/scribble-lib/scribble/private/run-pdflatex.rkt:19:0: run-pdflatex: got error exit code
===============
I found the package with:
apt-file update
apt-file search mathabx.sty
then:
apt install texlive-fonts-extra
:)
===============
Use a self-hosted Git service. For example, with Gitea you can gather repository mirrors using multiple GitHub tokens (and later keep them up to date, similar to GitHub forks).
With Gitea you can also publish packages through its API and configure HTTP access.
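A minimal sketch of creating a mirror through Gitea's API (the host, token variables, and repository are placeholders; check Gitea's API docs for the exact fields):

curl -X POST "https://gitea.example.com/api/v1/repos/migrate" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "clone_addr": "https://github.com/some-user/some-repo.git",
        "repo_name": "some-repo",
        "mirror": true,
        "auth_token": "'"$GITHUB_TOKEN"'"
      }'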
For transformers, pip install transformers==4.6.1 works; no cargo install needed, see this post.
I found the cause. It looks like a trivial issue, but it is not easy to spot.
First, I will explain why each command works or doesn't.
The following works because only SSH shell access is related, nothing to do with git.
ssh git@myserver
The following works because git does not use SSH.
git ls-remote http://ip:3000/user1/repo1.git
The following works because git+SSH implicitly loads "~/.ssh/id_ed25519" (I mistakenly thought the key had to be "id_ed25519_repo1"); in addition, the key "id_ed25519" had previously been configured via the Gitea web UI.
git ls-remote git@ip:user1/repo1.git
The following do not work, because git+SSH loads "~/.ssh/id_ed25519_repo1" explicitly; that key was added to authorized_keys manually.
git ls-remote myserver:user1/repo1.git
git ls-remote git@myserver:user1/repo1.git
So if you just add a bare entry to the file like the one below, "ssh user1@myserver" or "ssh user1@real-ip" will work fine, but git+SSH absolutely will not:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIvf4l5RjqWL+kOnxpqhhGAIcIkWVSHqLbgkAzMAlYGm user1@domain
The reason is the missing part that links the SSH key to git operations, which explains why SSH auth is OK but git does not recognize the repo path. The correct syntax to connect git to SSH should look like this:
command="/usr/local/bin/gitea --config=/etc/gitea/app.ini serv key-6",no-port-forwarding,no-X11-forwarding,no-user-rc,no-pty,restrict ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIvf4l5RjqWL+kOnxpqhhGAIcIkWVSHqLbgkAzMAlYGm user1@domain
It is quite long to edit manually, so it is better to let Gitea add it for us via the web UI. But one issue appears: once the "command" option is in place, SSH shell access as "user1" becomes impossible. I don't know how to enable both git+SSH and plain SSH access for the same user; my solution is to create a new key for pure SSH access, or to consider enabling the PasswordAuthentication option.
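A minimal sketch of the separate-key approach (the key file name and host alias are placeholders): generate a dedicated key for shell logins and point an SSH host alias at it, leaving the Gitea-managed key for git:

# generate a second key used only for shell access
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519_shell -C "shell access"

# ~/.ssh/config
Host myserver-shell
    HostName myserver
    User user1
    IdentityFile ~/.ssh/id_ed25519_shell
    IdentitiesOnly yes

Append the new public key to authorized_keys as a bare entry (without the command="..." prefix); then "ssh myserver-shell" gives a shell, while git keeps using the Gitea-managed key.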
Notes I want to share:
I have now written this SQL:
with cte1 as (
SELECT
cmo.[CMID] as object_id
,cmo.[PCMID] as parent_object_id
,cmo.[VERSION] as object_version
,cmo.[CREATED] as created_datetime
,cmo.[MODIFIED] as modified_datetime
,cmo.[DISABLED] as disabled
,cmo.[CLASSID] as class_id
,cmc.name as class_description
,cmo.[DISPSEQ] as display_sequence
-- report name --
,CMOBJNAMES.NAME
-- self join to get parent_class_id
, cmo2.CLASSID as parent_class_id
-- parent_class_desription
, cmc2.NAME as parent_class_description
,cmobjnames2.name as parent_object_name
, cmref2.REFCMID as owner_id
, props33.name as owner_name
, props33.userid as owner_user_id
, props33.LASTLOGIN as owner_last_login
, props33.license as owner_license_code
FROM CMOBJECTS cmo
-- get classid description
left join CMCLASSES cmc on
cmo.CLASSID=cmc.CLASSID
-- get objectname
left join CMOBJNAMES on
cmo.cmid=CMOBJNAMES.cmid
and CMOBJNAMES.isdefault=1
left join [CMOBJECTS] cmo2 on
cmo.PCMID=cmo2.CMID
left join CMCLASSES cmc2 on
cmo2.CLASSID=cmc2.CLASSID
--get parent object name
left join CMOBJNAMES cmobjnames2 on
cmo.pcmid=cmobjnames2.cmid
--and cmobjnames2.LOCALEID=92
and cmobjnames2.isdefault=1
-- get ownerid of report
left join CMREFNOORD2 cmref2 on
cmo.CMID=cmref2.CMID
-- get owner attributes
left join CMOBJPROPS33 props33 on
cmref2.REFCMID=props33.cmid
WHERE 1=1
--and (cmo.disabled=0
--or cmo.disabled is null
--)
and cmc.name = 'report'
)
select * from cte1
which returns this output (transposed into record format for easier viewing here).
I'm looking to add in when the reports were accessed/run next, to see if we can filter out any not used for a while. Does anyone know what tables I could use for this?
Thanks,
Rob.
DataWeave also supports newline-delimited JSON (ndjson): https://docs.mulesoft.com/dataweave/latest/dataweave-formats-ndjson
What you have here looks exactly like that. Before loading it, maybe try renaming the file extension to .ndjson.
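If you control the writer side, a minimal DataWeave sketch that emits ndjson (I believe the format identifier is application/x-ndjson; the linked docs confirm the exact value):

%dw 2.0
output application/x-ndjson
---
payload  // an array of objects is written as one JSON object per line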
Can you tell me how you do the security challenge / mutual authentication before you send the encrypted APDU with INS 21 (VERIFY)?
Made usable again by getting rid of /Users/<me>/Library/Application Support/JetBrains/IntelliJIdea2024.3/plugins/python/helpers-pro/bundled_stubs/django-stubs
There is an option to set the pivot point of the sprite similar to css "transform-origin".
In case we need a label to be on top of the object, the y should be negative, e.g. -0.35 on the screenshot below and up to -1:
sprite.center.set( 0, -1 );
Works for r146 (probably other versions as well)
From the docs:
nitro:build:public-assets
Called after copying public assets. Allows modifying public assets before Nitro server is built.
import { defineNuxtModule } from '@nuxt/kit'

export default defineNuxtModule({
  setup (options, nuxt) {
    nuxt.hook('nitro:build:public-assets', async (nitro) => {
      // modify files in the public assets directory here, before Nitro is built
    })
  }
})
This is resolved now.
I modified the Excel class reading logic to include relationship information.
I am embarrassed. The workbook had not been saved since I added ranges "DT_25" to "DT_30". Once saved the python code worked perfectly. Simple, simple oversight. @moken and @user202311 thank you for your help and suggestions.
@adden00 Is there any doc that shows how to build native lib libtun2socks.so ?
I think that you will need to explicitly set the auto_adjust parameter to False in order to (now) get the 'Adj Close' column.
df = yf.download('nvda', period="1d", auto_adjust=False)
Otherwise, you just get 'Close'.
I'm using Version: 0.2.51
I have the same problem: where _textEdgeNgramS is working, _textNgramS isn't. Unfortunately the documentation is rather incomplete on this, as usual.
I think this is a BUG. In some cases you just need certain settings machine-wide, like proxy settings. Unfortunately Java ignores the Windows default settings, so you need dedicated Java settings such as JAVA_TOOL_OPTIONS.
This shouldn't be an error; it should be an info at best.
ref: https://community.sonarsource.com/t/java-tool-options-setting-in-azure-devops-causes-failure/7764
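For example, a machine-wide proxy for all JVMs can be set through that variable (host and port are placeholders):

export JAVA_TOOL_OPTIONS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"

Any JVM started afterwards picks this up and prints a "Picked up JAVA_TOOL_OPTIONS: ..." notice on stderr.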
I do not know who edited my question into a purely JavaScript problem and underlined that this is not a Django thing, and gave a minus. I would be really pleased if I could give a minus to whoever edited my question and deleted my code.
Here is the problem: 1 - Django escapes the empty form. I stored the "empty form" in a block with {% autoescape off %}. 2 - I changed the form index to prefix, and then my problem was solved.
The height of .marketing needs to be set to 100%. With the current code the iframe is indeed getting 100% height and width, but relative to the size of the .marketing div. If we increase its height to 100%, it resolves the issue.
What I do is await Task.Run and call it without await inside, like this:
return await Task.Run<IEnumerable<ViewsDataModel>?>(() => {
lock (context)
{
return context.ViewsDataModels.Where(o => o.Owner == userId).ToList();
}
});
Since Dec 12, 2023 there is a new feature to validate directly by string: https://github.com/google/uuid/commit/9ee7366e66c9ad96bab89139418a713dc584ae29
anyUUID := "elmo"
err := uuid.Validate(anyUUID) // will result in an error
Live example: https://go.dev/play/p/QIzW63S0Oda
This is very useful when it comes to testing:
assert.NoError(t, uuid.Validate(anyUUID))
In my case I was developing a custom module for Drupal 9 when I encountered 'drush command terminated abnormally' while running a module disable or update-db command via the command line (git-bash), e.g. drush pm-uninstall <my_module> -y --debug --verbose, and it wouldn't give more info than that.
The error was eventually found by running the same command via the UI and checking /var/log/apache2/error.log. When running on the command line, the commands go through drush and the PHP interpreter, and the log location is found with
php -i | grep error_log
This location had all my errors.
I have been looking for a solution to the above-mentioned issue for a couple of weeks. Any help will be appreciated; please suggest the best solution for this problem.
Go to the verify_id_token() function in auth.py and change clock_skew_seconds=0 to clock_skew_seconds=60. It is working fine for me.
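If this is the firebase_admin auth.py, newer releases accept the skew as an argument, so you may not need to patch the library (a sketch; id_token is a placeholder, and check that your installed version supports clock_skew_seconds):

from firebase_admin import auth

# tolerate up to 60 seconds of clock drift between client and server
decoded = auth.verify_id_token(id_token, clock_skew_seconds=60)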
Did you manage to fix this problem? Because I have the same issue! Some names of the settings are missing when I try to print, and it doesn't print all the settings that I see in the NVIDIA Control Panel in the Manage 3D Settings tab. I don't understand why :(
I have found a way to make it work in regional in europe, by adding this to my config:
CALLER_ID = "urn:botframework:azure"
OAUTH_URL = "https://europe.token.botframework.com/"
TO_CHANNEL_FROM_BOT_LOGIN_URL = f"https://login.microsoftonline.com/{APP_TENANTID}/oauth2/v2.0/token"
TO_CHANNEL_FROM_BOT_OAUTH_SCOPE = "https://api.botframework.com/.default"
TO_BOT_FROM_CHANNEL_TOKEN_ISSUER = "https://api.botframework.com"
TO_BOT_FROM_CHANNEL_OPENID_METADATA_URL = "https://login.botframework.com/v1/.well-known/openidconfiguration"
TO_BOT_FROM_EMULATOR_OPENID_METADATA_URL = "https://login.microsoftonline.com/common/v2.0/.well-known/openid-configuration"
VALIDATE_AUTHORITY = True
Check out this package; it is a very helpful CLI tool to create folders for both React and Next.js, with JS and TypeScript: https://www.npmjs.com/package/react-cli-builder
According to the discussion in the community here, if you use UTF-8, ignore_above should be set to 32766 / 4 = 8191, since UTF-8 characters may occupy at most 4 bytes.
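A minimal mapping sketch with that limit (index and field names are placeholders):

PUT my-index
{
  "mappings": {
    "properties": {
      "my_keyword": {
        "type": "keyword",
        "ignore_above": 8191
      }
    }
  }
}

Values longer than 8191 characters are still kept in _source but are not indexed for that field.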
I disabled the scalebar the following way:
mapView.scalebar.enabled = false
But I am using Mapbox version "11.1.0"
David Foerster's answer was very instructive; thanks for the answer. Apparently I don't have the rep to comment (I think I forgot my original account), but I wanted to add that the chosen locale must exist (and the given string must be valid within that locale).
E.g. I do not have en_US.UTF-8, so my output was
content-type:text/html; charset:utf-8
?EURo Dikaiopolis en agro estin
?EURo Dikaiopolis en agro estin
4 5 6
When I changed it to a proper locale for the system, I got the expected:
content-type:text/html; charset:utf-8
🕽€ο Δικαιοπολις εν αγρω εστιν
🕽€ο Δικαιοπολις εν αγρω εστιν
4
5
6
What you are trying to achieve is an "area" chart type, not a "line" chart.
Try changing your script to this:
const dataPoints = [-10, 3, -5, -18, -10, 12, 8]
const discreteMarkers = dataPoints.map((value, index) => {
return {
shape: "circle",
size: 4,
seriesIndex: 0,
dataPointIndex: index,
fillColor: "#ffffff",
strokeWidth: 1,
};
});
var options = {
chart: {
height: 380,
type: "area",
foreColor: '#aaa',
zoom: {
type: 'x',
enabled: true,
autoScaleYaxis: true
},
},
series: [
{
name: "Series 1",
data: dataPoints
}
],
plotOptions: {
line: {
colors: {
threshold: 0,
colorAboveThreshold: '#157446',
colorBelowThreshold: '#C13446',
},
},
},
markers: {
discrete: discreteMarkers
},
grid: {
borderColor: '#6D6D6D',
strokeDashArray: 3,
},
xaxis: {
categories: [
"01 Jan",
"02 Jan",
"03 Jan",
"04 Jan",
"05 Jan",
"06 Jan",
"07 Jan"
]
},
dataLabels: {
enabled: false
},
stroke: {
curve: 'smooth',
width: 2
},
fill: {
type: "solid",
colors: ["#E6F4EA" ]
},
};
var chart = new ApexCharts(document.querySelector("#chart"), options);
chart.render();
That would render this:
In the FedEx API documentation, they say that the returned URL containing the label PDF will be active for 24 hours.
https://developer.fedex.com/api/en-us/catalog/ship/v1/docs.html#operation/Create%20Shipment
So, after generating a new label, the recommended approach is to store the PDF file in some external service like S3.
Since Python is an interpreted language, each environment links to an executable associated with that environment, which then interprets the Python code. Have you thought about using subprocess.run() to start the matching executable, with the code you want to run as a file parameter?
import subprocess

# path_to_environment: the virtualenv whose interpreter should run the script
python_executable = f"{path_to_environment}/bin/python"
command = [python_executable, script_path]
result = subprocess.run(command, capture_output=True, text=True)
print(result.stdout)
I use date and strings:
$fullYear = date("Y");
$century = substr($fullYear, 0, 2);
use intval if you need to calculate something
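For example, continuing the snippet above:

$century = intval(substr($fullYear, 0, 2)); // e.g. 20 for year 2025
$nextCentury = $century + 1;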
You can just declare a variable in your TS file based on the window:
export class MyComponent {
  // window.innerWidth (window has no clientWidth property)
  diameter = window.innerWidth / 10;
}
This might save someone's day,
The error in our case was that the permissions for SSRS (SQL Server Reporting Services) were not enough. My IIS Application Pool was configured with ApplicationPoolIdentity; I changed it to LocalSystem and that fixed it.
Also, adding a low-level try-catch helped me identify the real error, as this error is usually not accurate and there's an underlying error.
Thanks for your reply Ahmed! Can you maybe tell me what versions you were using? Nuxt version and so on?
The MultiControl Hub lets you manage multiple computers with one keyboard and mouse, offering seamless, lag-free switching without extra software. Ideal for professionals, gamers, and multitaskers, it saves desk space and boosts productivity.
Key Features:
- Control multiple devices with one keyboard and mouse
- Plug-and-play, no software needed
- Instant, lag-free switching
- Reduces desk clutter
- Compatible with Windows, macOS, and Linux
- Enhances workflow and efficiency
In my case, I forgot to associate the variable with the project in Vercel. There you have a box that indicates the variable is associated with project XXXX; by default it is not associated with any.
Did you manage to resolve this?
Did somebody resolve this thing?
This way your team can pull the required/latest version from the repo.
On the Magit status page, type d for diff, r for range, and then enter "master" for the branch to diff against.
It will show all the diffs. The trick is, in the Magit-diff buffer, press Shift+Tab (which is bound to "magit-section-cycle-global") to collapse the sections and show only the file names.
For faster bulk inserts, it's better to modify your data in a temporary table before inserting it into the final table, then use INSERT ... SELECT to insert everything at once.
This reduces extra work for the database and speeds things up. Chunking is usually only needed for extremely large datasets.
That should do the work =) A sketch of the pattern follows.
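A minimal sketch (table and column names are placeholders; syntax is MySQL-flavoured):

-- stage and clean the data first
CREATE TEMPORARY TABLE staging LIKE final_table;
INSERT INTO staging (id, name) VALUES (1, 'a'), (2, 'b');
UPDATE staging SET name = TRIM(name);

-- then move everything over in one statement
INSERT INTO final_table (id, name)
SELECT id, name FROM staging;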
I was testing on your example:

import csv

csv_file = "example.csv"  # placeholder path to your CSV file
with open(csv_file, "r") as file:
    reader = csv.DictReader(file)
    for row in reader:
        print(row)
I found this while trying to figure out how to allow "Allow unauthenticated invocations": https://stackoverflow.com/a/78545216/5503408
And it works fine if we want to disable the authentication.
Here is a solution / overview for your question; I hope it will be useful for you.
public function register(Request $request)
{
// Validate form data
$request->validate([
'name' => ['required', 'regex:/^[\pL\s]+$/u', 'max:255'],
'email' => ['required', 'email', 'unique:users,email'],
'password' => [
'required',
'min:8',
'regex:/[A-Z]/', // At least one uppercase letter
'regex:/[a-z]/', // At least one lowercase letter
'regex:/[0-9]/', // At least one digit
'confirmed' // Match with password_confirmation
],
], [
'name.required' => 'The name field is required.',
'name.regex' => 'The name can only contain letters and spaces.',
'email.required' => 'The email field is required.',
'email.email' => 'Please provide a valid email address.',
'email.unique' => 'This email address is already registered.',
'password.required' => 'The password field is required.',
'password.min' => 'The password must be at least 8 characters long.',
'password.regex' => 'The password must contain at least one uppercase letter, one lowercase letter, and one digit.',
'password.confirmed' => 'The password confirmation does not match.',
]);
// Save user data
User::create([
'name' => $request->name,
'email' => $request->email,
'password' => bcrypt($request->password),
]);
return redirect()->route('login')->with('success', 'Registration successful!');
}
You can also put that validation code in a request file. Create a new request file for the register form validation using the following command:
php artisan make:request RegisterRequest
Then add the rules and messages in that request file's functions. I suggest this way because the code ends up better organised and you can also reuse it across CRUD functions; a sketch of such a request class follows.
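A minimal sketch of the request class (the class name comes from the artisan command above; the rules mirror the controller validation):

<?php

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

class RegisterRequest extends FormRequest
{
    public function authorize(): bool
    {
        return true; // registration is open to guests
    }

    public function rules(): array
    {
        return [
            'name' => ['required', 'regex:/^[\pL\s]+$/u', 'max:255'],
            'email' => ['required', 'email', 'unique:users,email'],
            'password' => ['required', 'min:8', 'regex:/[A-Z]/', 'regex:/[a-z]/', 'regex:/[0-9]/', 'confirmed'],
        ];
    }

    public function messages(): array
    {
        return [
            'name.regex' => 'The name can only contain letters and spaces.',
            'password.regex' => 'The password must contain at least one uppercase letter, one lowercase letter, and one digit.',
        ];
    }
}

Then type-hint it in the controller, public function register(RegisterRequest $request), and the validation runs automatically.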
Your list should be formatted as follows:
{
"data": [
{
"date": "2022-12-13",
"symbol": "nsht",
"price": "45.12"
},
{
"date": "2022-12-13",
"symbol": "asdf",
"price": "45.14442"
}
]
}
I also had this error, but thanks to our friend's answer I managed to resolve it.
It seems your project structure has a root folder (package), so you need to import it as follows: from myProject.package.items import item
There are a couple of things that come to mind with regards to this error.
Verify that the Salesforce user account you're using has the "API Enabled" permission. This is required to connect through the JDBC driver.
If your IP address is not included in Salesforce's trusted IP ranges, the connection will fail unless you append the security token to the password. Make sure your current IP address is allowed under Setup > Security Controls > Network Access in Salesforce.
You might also need to modify your JDBC URL to include AuthScheme:
jdbc:cdata:salesforce:AuthScheme=Basic;User=myUser;Password=myPassword;Security Token=myToken;
In case the error still persists you can add logging properties to your connection string by modifying it to:
jdbc:cdata:salesforce:AuthScheme=Basic;User=myUser;Password=myPassword;Security Token=myToken;Logfile=D:\\path\\to\\logfile.log;LogVerbosity=3;
Once the Logs are generated please navigate to the error message to get more detailed information on this.
Currently, Visual Studio does not support creating custom snippets for Razor, so they won't work there. You can check the supported languages here: Code snippets schema reference.
Apparently, only built-in snippets work in .razor files. You can find a discussion and some suggestions for working around this limitation, such as editing built-in snippets or using the legacy editor, in this issue: 6397.
In the AndroidManifest file, add android:usesCleartextTraffic="true" (important), then run:
cd android
./gradlew clean
and rebuild your project; it will work.
Using a Laravel join vs. using Laravel relations (see the sketch after these notes):
For large data sets, go with joins to optimize performance.
For medium to small data sets, or when focusing on maintainable and readable code, use Eloquent relationships with eager loading.
Note: if you're dealing with extremely large data sets, consider using chunking or pagination with either approach to avoid memory exhaustion.
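A minimal sketch of both approaches (the User model, posts relation, and column names are placeholders):

use Illuminate\Support\Facades\DB;
use App\Models\User;

// Query-builder join: one SQL statement, flat rows
$rows = DB::table('users')
    ->join('posts', 'posts.user_id', '=', 'users.id')
    ->select('users.name', 'posts.title')
    ->get();

// Eloquent relation with eager loading: two queries, hydrated models
$users = User::with('posts')->get();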
The issue has been discussed with the Spring team: https://github.com/spring-cloud/spring-cloud-stream/issues/3066
For SSO, you can use the ID token and verify it using a JWT package with jwt.decode, or go to the official jwt.io site. If you want to try it with an access token, then you need the private and public keys.
In some cases it doesn't work if your phone is set to battery-saver mode. https://stackoverflow.com/a/71118394/24131641
I spotted an issue that did not solve the problem but is related: the dynamic template was missing the field match_pattern.
Setting the pattern to regex, the correct version follows:
fields: {
mapping: {
type: 'text'
},
match_mapping_type: 'string',
match_pattern: 'regex',
path_match: 'dict.*',
match: '^\\d{1,19}$'
}
In addition to this dynamic-template correction, I needed to introduce the following to my mapping properties:
dict: { type: 'object' },
In my tests this accepts the digit fields and rejects non-digit ones, solving the problem, but it also accepts an empty dict, which is not ideal.
I have this same exact issue. It is driving me crazy.
Use BB_ENV_PASSTHROUGH_ADDITIONS: https://docs.yoctoproject.org/bitbake/bitbake-user-manual/bitbake-user-manual-metadata.html#passing-information-into-the-build-task-environment
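For example (MYVAR is a placeholder for the variable you want to pass through):

export BB_ENV_PASSTHROUGH_ADDITIONS="MYVAR"
export MYVAR="some-value"
bitbake core-image-minimal   # MYVAR is now visible to the build environment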
Did you get any solution? I am stuck at this problem right now.
I need help on this one too, bumping the ticket for assistance!
I confirm @vahagn's answer. In case anyone is wondering what a smart banner is: it means you just have to add a meta tag inside the <head></head> tags, like below:
<!DOCTYPE html>
<html lang="en">
<head>
<meta name="apple-itunes-app" content="app-id=1234567890, app-clip-bundle-id=com.example.myapp.clip">
<title>Your title</title>
...
</head>
<body>
...
</body>
</html>
The issue is with the file structure. I had to update the layout with SDK 52 and Expo Router.
I don't see exactly what's wrong with your code. How did you "create" the user? Where are you getting the logs from? Did you check if there are any users, with User.all in the Rails console?
Here is a short tutorial that has helped me. https://dev.to/casseylottman/adding-a-field-to-your-sign-up-form-with-devise-10i1
Split trues/falses, then get the IDs which exist in both groups.
SQL Server:
select distinct t1.ID from (select ID from thetable where VALUE='false' ) t1,
(select ID from thetable where VALUE='true') t2
where t1.ID=t2.ID
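Equivalently, on SQL Server you can write this with INTERSECT:

select ID from thetable where VALUE = 'false'
intersect
select ID from thetable where VALUE = 'true';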
I solved the issue by running VS Code as an administrator. I hope it helps someone (:
The draw and fill methods can take a Shape, one of which is an Arc2D.Double.
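A minimal Swing sketch of both calls (sizes and colors are arbitrary):

import java.awt.*;
import java.awt.geom.Arc2D;
import javax.swing.*;

class ArcPanel extends JPanel {
    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2 = (Graphics2D) g;
        // a 90-degree pie slice starting at 45 degrees
        Arc2D.Double arc = new Arc2D.Double(20, 20, 120, 120, 45, 90, Arc2D.PIE);
        g2.setColor(Color.ORANGE);
        g2.fill(arc);   // fill takes the Shape
        g2.setColor(Color.BLACK);
        g2.draw(arc);   // draw takes the same Shape
    }
}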
By default, bots can only access messages in chats with the bot (chats the bot is a member of), so what happened here is that the bot is not a member of the chat_id in reply_parameters.
The best solution is to use Trigger = "On save" and When updating? = "Created on". After creation, it can take up to 4 hours to be triggered.
I want to do the same thing with my app, but currently I have only found a solution in SwiftUI; for React Native I still have no clue.
For the SwiftUI source, you can find it here: https://github.com/metasidd/Prototype-Siri-Screen-Animation
You can implement it like this:
Effect screenshots:
let view = UITextView()
view.attributedText = testAttributedString()
return view
func testAttributedString() -> NSAttributedString {
let test = NSMutableAttributedString()
test.append(.init(string: "How"))
test.append("are".generateImage(.init(width: 60, height: 30)))
test.append(.init(string: "you"))
return test
}
extension String {
func generateImage(_ size: CGSize,
textFont: UIFont = .systemFont(ofSize: 16),
textColor: UIColor = .white,
fillColor: UIColor = .brown) -> NSAttributedString {
let format = UIGraphicsImageRendererFormat()
format.scale = UIScreen.main.scale
let render = UIGraphicsImageRenderer(size: size, format: format)
let image = render.image { context in
let ellipsePath = UIBezierPath(roundedRect: CGRect(origin: .zero, size: size), cornerRadius: size.height / 2).cgPath
context.cgContext.setFillColor(fillColor.cgColor)
context.cgContext.addPath(ellipsePath)
context.cgContext.fillPath()
let attributed = NSAttributedString(string: self, attributes: [.font: textFont, .foregroundColor: textColor])
let textSize = attributed.size()
attributed.draw(at: CGPoint(x: (size.width - textSize.width) / 2, y: (size.height - textSize.height) / 2))
}
let attachment = NSTextAttachment(data: nil, ofType: nil)
attachment.image = image
attachment.bounds = .init(x: 0, y: -9.3125, width: size.width, height: size.height)
attachment.lineLayoutPadding = 5
return .init(attachment: attachment)
}
}
I agree with rizzling about that: useSuspenseQuery() blocks rendering until the data is fetched, while useQuery() handles rendering and loading data at the same time.
Rendering time is limited to 60 seconds. So with the first hook (useSuspenseQuery) the rendering won't start until the data is fetched (however long that takes), but with the second hook (useQuery) the rendering starts immediately, in parallel with fetching the data.
Say fetching the data takes 90 seconds: if you use useSuspenseQuery, you will not face any issue, because the rendering will start after the 90 seconds. If you use useQuery, you will face the timeout error, because you reach the 60th second and no data has been fetched yet.
You need to review your API performance and use logging and monitoring tools to find the bottlenecks.
If you want to use MessageBoxA, you need to open your tasks.json file and add a link to User32.lib.
Here is the sample code:
{
"tasks": [
{
"type": "cppbuild",
"label": "C/C++: cl.exe build active file",
"command": "cl.exe",
"args": [
"/Zi",
"/EHsc",
"/nologo",
"/Fe${fileDirname}\\${fileBasenameNoExtension}.exe",
"${file}",
"/link",
"User32.lib"
],
"options": {
"cwd": "${fileDirname}"
},
"problemMatcher": [
"$msCompile"
],
"group": {
"kind": "build",
"isDefault": true
},
"detail": "Task generated by Debugger."
}
],
"version": "2.0.0"
}
Add tabindex="-1" to your element; setting this attribute to -1 makes the element unfocusable.
So you can add/remove this attribute dynamically, as in the sketch below.
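A minimal sketch of toggling it from script (the element id is a placeholder):

const el = document.getElementById('my-element');
el.setAttribute('tabindex', '-1');  // now unreachable via Tab
el.removeAttribute('tabindex');     // back to its default focus behaviour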
To answer the second part of the question: if you have the non-standard 'rev' (reverse) command available, simply reverse the line, then cut from the nth column to the end, then reverse back, e.g. '... | rev | cut -d. -f2- | rev'.
To combine this with the first part of your question, you would cut the first 'n' columns before the first rev.
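For example, stripping the last dot-separated field:

$ echo "archive.tar.gz.bak" | rev | cut -d. -f2- | rev
archive.tar.gz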
@Mike Macpherson Sorry for my response exactly one year later, buuuut... did you find any solution for your problem? I am facing the same one.
I used the FFmpegMetadataRetriever, which helped me with some stream info; maybe that could help you.
But I am still trying to find out how to get more information using ExoPlayer.
sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose up
Hello, I found this blog helpful for validating the content of a file. However, the logic to compare the signature works for some known file types like jpeg, doc, docx, etc. The logic doesn't work for file types like txt, log, and JSON. Is there any solution to validate the content type of txt, log, and JSON files?
OK, I was able to resolve it using Java 17 by changing some configurations.
I used Scala 2.12.15 and updated sparkTestsVersion to 1.1.0 (this helped solve the ReflectiveOperationException).
As for the Java options, I didn't find a good way of setting them in build.sbt, so I just added it as a step in GitHub Actions, as follows:
- name: Set JAVA_OPTS
if: ${{ inputs.JAVA_VERSION == '17' }}
run: echo "JAVA_OPTS=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" >> $GITHUB_ENV
According to Google, this is the solution:
AppCompatDelegate.setApplicationLocales(LocaleListCompat.forLanguageTags(locale));
Use the language tag and not the regional tag r, as in my example kq-GN.
NB! This code works from Android Tiramisu and up.
My problem is that even this code is not working; the app will not change to any language with it. If I run
AppCompatDelegate.getApplicationLocales().toString()
I get this output: []
I even get this result before I switch the language, so it seems that something is wrong here.
MixPlayer (https://www.npmjs.com/package/mix-player) is your solution! It supports most of the common file formats (FLAC, MP3, Ogg, VOC, and WAV files) and has customizability for fade-ins, volume changing, seeking, looping, etc.
Here's an example snippet:
import { MixPlayer } from "MixPlayer";
MixPlayer.play("test_audio.mp3");
MixPlayer.onAudioEnd(() => {
console.log("Audio ended! Now what?");
});
await MixPlayer.wait();
process.exit(0);
You can construct a TikTok sharing URL for a specific piece of content. If you have a TikTok link to share, you can simply redirect the user:
https://www.tiktok.com/share/video?url=<your-content-url>
Replace <your-content-url> with the URL of the content you want to share.
Actually, you are using the wrong plugin. The right plugin for you is "LottieFiles"; here's a screenshot of the plugin. There are a ton of videos on YouTube about how to use this plugin; I'm sharing one here: https://www.youtube.com/watch?v=mtmYqqbpUVs
Additionally, you would want to use SVG animations on the web rather than GIFs, because they are vector-graphic animations with two benefits: tiny size, and scalability without pixelation. GIFs are almost outdated and obsolete for the web, in my opinion.
I encountered the same issue, and for me it was related to how I structured the logic in my React component. Specifically, I had the Google login initialization, One Tap prompt display, and login button rendering all inside a single useEffect hook. Once I split the logic into separate useEffect hooks for each part, the One Tap modal started dismissing as expected, without needing any manual intervention (see the sketch below).
Interestingly, I found that the issue of the modal not dismissing was only present in Chrome. The modal dismissed correctly in other browsers, but not in Chrome. Splitting the logic into separate hooks resolved the issue in Chrome as well.
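A minimal sketch of the split (google.accounts.id is the Google Identity Services API; the client ID and button div are placeholders):

import { useEffect } from "react";

function GoogleLogin({ onCredential }) {
  // 1) initialize once
  useEffect(() => {
    window.google.accounts.id.initialize({
      client_id: "YOUR_CLIENT_ID",
      callback: onCredential,
    });
  }, [onCredential]);

  // 2) show the One Tap prompt
  useEffect(() => {
    window.google.accounts.id.prompt();
  }, []);

  // 3) render the login button
  useEffect(() => {
    window.google.accounts.id.renderButton(
      document.getElementById("google-btn"),
      { theme: "outline", size: "large" }
    );
  }, []);

  return <div id="google-btn" />;
}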
2025:
For Google Chrome, check out the steps here in the docs: https://developer.chrome.com/docs/devtools/overrides
Overriding the header content of the network resource does it for me.
I resolved this issue by adding the TestNG library via Configure Build Path and updating the TestNG plugin.
I'm using OpenSSL 3.3.1 4 Jun 2024 (Library: OpenSSL 3.3.1 4 Jun 2024), on Ubuntu 24.10.
I'm having similar issues, but here are two notes: 1) you are not specifying a signer_digest, either in the config file or via a -digest command-line option; 2) we can't see your certificate information in order to assess whether the certificates are well-formed.
And that was the comment I was about to post, when I tried a few more things and it started working.
Starting from the end, here's my config file, named x509.cnf:
[ server ]
basicConstraints = CA:FALSE
extendedKeyUsage = critical, timeStamping
[ tsa ]
default_tsa = tsa_config
[ tsa_config ]
dir = .
serial = $dir/serial
crypto_device = builtin
signer_cert = $dir/ca-int.crt
signer_digest = SHA256
signer_key = $dir/ca-int.key
default_policy = 1.2.3.4.1
digests = sha256
accuracy = secs:1, millisecs:500, microsecs:100
ordering = yes
tsa_name = yes
Two things are immediately apparent:
1) default_policy expects the actual value, and not a section name. I got this one from the error message:
4027392CF87A0000:error:17800087:time stamp routines:ts_CONF_invalid:var bad value:../crypto/ts/ts_conf.c:120:tsa_config::default_policy
2) signer_digest is mandatory. This error message:
40473E889B7C0000:error:17800088:time stamp routines:ts_CONF_lookup_fail:cannot find config variable:../crypto/ts/ts_conf.c:115:tsa_config::signer_digest
is why I added the line:
signer_digest = SHA256
Documentation states this is not optional, although it says nothing about the actual values. Yeah, openssl docs, right? Thank God the product is actually great.
Here are my steps:
LEN=${LEN:-2048}
# create a root.
openssl req -new -x509 -noenc -out ca.crt -keyout ca.key -set_serial 1 -subj /CN=CA_ROOT -newkey rsa:$LEN -sha512 || exit 1
# create TSA CSR
openssl req -new -noenc -config x509.cnf -reqexts server -out tsa.csr -keyout tsa.key -subj /CN=TSA -newkey rsa:$LEN -sha512 || exit 1
# Sign the TSA with `ca.crt`
openssl x509 -req -in tsa.csr -CAkey ca.key -CA ca.crt -days 20 -set_serial 10 -sha512 -out tsa.crt -copy_extensions copy || exit 1
As you can see, the ROOT is generated completely without a configuration and the TSA is then signed by the ROOT. The crucial point here is this line in your config:
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
which is precisely why you get something like:
4097C0FB27790000:error:17800075:time stamp routines:TS_RESP_CTX_set_signer_cert:invalid signer certificate purpose:../crypto/ts/ts_rsp_sign.c:142:
The only key usage of this certificate must be timeStamping, which, not being among the standard key usages, must be fed via an extended key usage extension. If this is as self-evident to you as it was to me, welcome to RFC HELL! By now I know by heart larger swaths of RFC 5280 than is mentally healthy, and I still feel quite ignorant.
So, remove the keyUsage line from your cnf and it should fly.
Just run:
openssl ts -reply -config x509.cnf -queryfile request.tsq
and admire the gibberish on your screen. Or add -out response.tsr and save it for later.
For me, the issue was with an ad-blocker browser plugin, so I just turned off the plugin and the issue was resolved. :)
I have a similar issue. When our AWS build pipelines run cdk synth, the process downloads the public.ecr.aws/sam/build-python3.10 image and then runs the following command, which now pulls in v2.0.0 of Poetry (which no longer has the required export option):
[2/2] RUN python -m venv /usr/app/venv && mkdir /tmp/pip-cache && chmod -R 777 /tmp/pip-cache && pip install --upgrade pip && mkdir /tmp/poetry-cache && chmod -R 777 /tmp/poetry-cache && pip install pipenv==2022.4.8 poetry && rm -rf /tmp/pip-cache/* /tmp/poetry-cache/*
After detailed analysis, we observed that at the time of the issue Varnish server processes increased, and Varnish gave incomplete requests and returned 504 to the Google load balancer. I am sharing the Google LB error and SAR command output below:
{ "insertId": "1l6m956f37u7rz", "jsonPayload": { "@type": "type.googleapis.com/google.cloud.loadbalancing.type.LoadBalancerLogEntry", "backendTargetProjectNumber": "projects/488", "remoteIp": "106.197.5.134", "statusDetails": "client_disconnected_before_any_response", "cacheDecision": [ "CACHE_MODE_USE_ORIGIN_HEADERS" ] }, "httpRequest": { "requestMethod": "POST", "requestUrl": "https://abc/iry", "requestSize": "364", "userAgent": "Mozilla/5.0 (iPad; CPU OS 15_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/125.0.6422.80 Mobile/15E148 Safari/604.1", "remoteIp": "106.197.5.134", "referer": "https://xyz/efg/w34pro-smartwatch-23944871197.html?pos=2&kwd=smart%20watch&tags=A|PL|||8752.144|Price|product|||LSlc|rsf:pl-|-res:RC4|ktp:N0|stype:attr=1|mtp:G|grpfl:45|wc:2|qr_nm:gd|com-cf:nl|ptrs:na|mc:184363|cat:248|qry_typ:P|lang:en|flavl:10|cs:9555", "latency": "0.024914s" }, "resource": { "type": "http_load_balancer", "labels": { "zone": "global", "forwarding_rule_name": "-logical-seperation-443lb", "target_proxy_name": "logical-speration-lb-target-proxy-2", "backend_service_name": "varnish-group3", "url_map_name": "logical-speration-lb", "project_id": "abc" } }, "timestamp": "2025-01-08T03:32:37.468854Z", "severity": "INFO", "logName": "projects/987/logs/requests",
}
Output of the SAR command: when the process count increases from ~1500 to 3k-4k, the errors start coming:

             runq-sz plist-sz ldavg-1 ldavg-5 ldavg-15 blocked
03:00:07 IST    4     1543     1.42    1.52    1.55      1
03:01:07 IST    2     1547     1.31    1.48    1.53      1
03:02:06 IST    4     2044     1.65    1.55    1.55      0
03:03:06 IST    1     4079     1.38    1.49    1.53      0
03:04:06 IST    1     4224     1.67    1.54    1.55      0
03:05:06 IST    2     4228     1.69    1.58    1.56      1
03:06:06 IST    1     4223     1.43    1.53    1.54      2
03:07:06 IST    1     4208     1.60    1.57    1.56      0
03:08:06 IST    1     4196     1.54    1.54    1.55      0
03:09:06 IST    1     4063     1.66    1.58    1.56      0
03:10:06 IST    1     3822     1.58    1.58    1.56      0
03:11:06 IST    1     3592     1.56    1.55    1.55      0
03:12:06 IST    2     3349     1.24    1.46    1.52      0
03:13:06 IST    1     3098     1.29    1.44    1.50      0
03:14:06 IST    1     2863     1.41    1.46    1.51      0
03:15:06 IST    1     2618     1.36    1.43    1.50      0
03:16:06 IST    1     2391     1.85    1.57    1.54      0
03:17:06 IST    2     2147     1.52    1.53    1.53      0
Sharing the log below:
20241218211542 - - - 0 - Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.6778.108 Mobile Safari/537.36 (compatible; Googlebot/2.1; )
FROM signal si LEFT JOIN block b ON prov.block_id= b.id AND si.signal_id = b.id AND si.type = 'BLOCK'
At this level your prov.block_id is not visible. You add it later.
GitHub reusable workflow inputs and secrets are defined and passed separately, and this is why we can't pass secrets as build-argument values.
However, we can work around this in the following way:
- Define a secret ARG_ONE.
- In the caller workflow, pass the build_args input as:
  build_args: |
    ARG_ONE=${ARG_ONE}
    ARG_TWO=ARG_TWO_plain_text
- In the reusable workflow, substitute ${ARG_ONE} with the value of the secret ARG_ONE, write the build_args variable with the substituted value, and pass it as a multiline build_args as usual to the docker/build-push-action action.
- The substituted variable value in build_args will be masked as a regular secret.
The caller workflow:
name: Docker
on:
workflow_dispatch:
jobs:
build-and-push:
name: Build and Push
uses: org/repo/.github/workflows/docker-reusable.yml@main
with:
docker_file: docker/Dockerfile
build_args: |
ARG_ONE=${ARG_ONE}
ARG_TWO=ARG_TWO_plain_text
secrets: inherit
The reusable workflow:
name: Docker reusable workflow
on:
workflow_call:
inputs:
docker_file:
default: Dockerfile
description: Dockerfile
required: false
type: string
build_args:
default: ''
description: Build arguments
required: false
type: string
env:
DOCKER_FILE: ${{ inputs.docker_file }}
BUILD_ARGS: ${{ inputs.build_args }}
jobs:
build:
name: Build and push
runs-on: ubuntu-latest
steps:
- name: Secrets to variables
if: ${{ env.BUILD_ARGS != '' }}
uses: oNaiPs/[email protected]
with:
secrets: ${{ toJSON(secrets) }}
exclude: DOCKERHUB*
- name: Substitute build args
if: ${{ env.BUILD_ARGS != '' }}
run: |
{
echo 'BUILD_ARGS<<EOF'
echo "${{ env.BUILD_ARGS }}"
echo EOF
} >> "$GITHUB_ENV"
- name: Build and Push by digest
id: build
uses: docker/build-push-action@v6
with:
context: .
file: ${{ env.DOCKER_FILE }}
platforms: linux/amd64,linux/arm64
push: true
build-args: |
${{ env.BUILD_ARGS }}
labels: ${{ steps.meta.outputs.labels }}
This partial example is based on Build and load multi-platform images from Examples.
We added two optional steps, which will be executed only when the build_args input is passed, and we use oNaiPs/secrets-to-env-action to expose secrets as variables. Don't forget secrets: inherit in the caller workflow.
In addition to the OP's answer, ensure that your PATH environment variable includes %HADOOP_HOME%\bin (on Windows), or else even downloading the correct winutils version won't help.
OK, after following up on @pskink's comment, I reached the solution:
@override
void initState() {
super.initState();
// other code
SchedulerBinding.instance.addPostFrameCallback((timeStamp) {
context.findRenderObject()?.visitChildren(_visitor);
});
}
void _visitor(RenderObject child) {
if (child is RenderEditable) {
setState(() {
// assign RenderEditable node to widget state
// make sure you get the correct child, for me there is only one textfield for testing
reEdt = child;
});
return;
}
child.visitChildren(_visitor);
}
// call when inserting text and want to scroll to cursor
void scrollToSelection(TextSelection selection) {
// find local rect of cursor or starting selection in case of selecting text
final localRect = reEdt?.getLocalRectForCaret(TextPosition(offset: selection.baseOffset));
if (localRect == null) return;
scrollController.jumpTo(localRect.top);
}
and don't forget to assign the scrollController to the TextField.
For me this part would not be correct:
output_shape = ((A.shape[0] - kernel_size) // stride + 1,
                (A.shape[1] - kernel_size) // stride + 1)
In case A.shape = [5, 5], kernel_size = 3, stride = 2, it would give output_shape = 2, but the result should be output_shape = 3. In my opinion the correct expression should be:
output_shape = (ceil((A.shape[0] - kernel_size) / stride + 1),
                ceil((A.shape[1] - kernel_size) / stride + 1))
Regards.
Yes, this works for stocks. How can I get data for NIFTY and BANKNIFTY for all columns and send the Excel output to the output folder?
What should I write as the stock on F&O?
import requests as req

# NSE requires browser-like headers; 'headers' is assumed to be defined
data = []
api_req = req.get('https://www.nseindia.com/api/quote-derivative?symbol=NIFTY', headers=headers).json()
for item in api_req['stocks']:
    data.append([
        item['metadata']['instrumentType'],
        item['metadata']['openPrice']])
I had a similar problem updating a Node.js project to use a moduleResolution of node16.
Removing an old paths section that explicitly forced the TS compiler to look in a specific location was the solution:
"paths": {
"*": [
"node_modules/*"
]
}
People are usually mixing up two things: C# 3.0 and .NET Framework 3.5.
As others have described, there are tables for each C# language version with its compatible Visual Studio and .NET Framework. C# 3.0 was the version of the programming language released in 2007, and it came with great features like LINQ (for working with data).
At the same time, there was an update called .NET Framework 3.5, which added tools for developers.
The confusion exists because .NET Framework 3.5 was released around the same time (after C# 3.0), so some people mistakenly think of it as "C# 3.5", but C# 3.5 doesn't exist.
It's just C# 3.0 with extra tools from .NET Framework 3.5.