Did you manage to solve this problem? I am currently facing the same issue.
Taking a lead from @Jeffrey's comment, this is how you can calculate the Unix timestamp:
= ( A1 - DATE(1970,1,1) ) * 86400
The reason is that datetime values in Google Sheets, Excel, and other spreadsheets have an epoch of 30 December 1899, whereas the Unix epoch is 1 January 1970. There's a bit of tech history around this: https://stackoverflow.com/a/73505120/2294865
Remember that a datetime/timestamp value is generally interpreted naively as being in UTC, so be aware of this when converting to/from dates and times, which typically take on local timezones and daylight-saving adjustments.
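For example, with A1 holding 2024-01-01 00:00:00 (naive UTC), the formula
= ( DATE(2024,1,1) - DATE(1970,1,1) ) * 86400
gives 19723 days * 86400 = 1704067200, the Unix timestamp for that instant.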
const str = "ramesh-123-india";
const result = str.match(/\w+$/)[0]; // \w+$ matches the trailing run of word characters: "india"
console.log(result);
Unlike GPT-based models, Open Llama's temperature handling can vary based on implementation and may have a different effect on probability scaling. If temperature changes don't seem to work, you might also need to adjust top-k or top-p parameters alongside it.
For a better way to tune responses dynamically, consider DoCoreAI, which adapts intelligence parameters beyond just temperature, helping generate more fine-tuned and predictable outputs across different models like Open Llama.
These days there's also brightnessctl: https://github.com/Hummer12007/brightnessctl
It is available in many distributions and works directly through sysfs (therefore it does not need an xorg.conf file, like xbacklight does for intel_backlight).
It sets up udev rules and requires the user to be in the "video" group to control brightness.
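For reference, typical usage looks roughly like this (from its help output; exact device names vary per machine):
brightnessctl get       # print the current brightness
brightnessctl set 50%   # set brightness to 50%
brightnessctl set 10%+  # increase brightness by 10%
brightnessctl set 10%-  # decrease brightness by 10%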
I have followed the given steps, and while trying to create an environment afterwards I am getting an SSL certification issue:
Exception: HTTPSConnectionPool(host='repo.anaconda.com', port=443): Max retries exceeded with url: /pkgs/main/win-64/repodata.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)')))
How can I ignore the SSL certificate?
I got the same error in a glTF model viewer, and it turned out to be a bug in Chrome's ImageBitmap-from-Blob API, as far as I can understand the issue: https://issues.chromium.org/issues/404044460
(Maybe GPU driver related)
Same problem here. You can use a composition association and change it into an aggregation, like WC Chou said, but I don't know why it turns back into a composition association after a few minutes. It doesn't make sense.
https://github.com/pjfanning/excel-streaming-reader
This library solved my problem.
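For context, reading a large workbook with it looks roughly like this (adapted from the project's README; the file name is a placeholder):
import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import com.github.pjfanning.xlsx.StreamingReader;

public class StreamingRead {
    public static void main(String[] args) throws Exception {
        try (InputStream is = new FileInputStream("big-spreadsheet.xlsx");
             Workbook workbook = StreamingReader.builder()
                     .rowCacheSize(100)  // number of rows to keep in memory at a time
                     .bufferSize(4096)   // buffer size for reading the InputStream
                     .open(is)) {
            for (Sheet sheet : workbook) {
                for (Row row : sheet) {
                    // process each row without loading the whole file into memory
                }
            }
        }
    }
}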
Possibly related to - https://github.com/spring-cloud/spring-cloud-netflix/pull/4394/files
Fixed in 2024.0.1; try setting RequestConfig.
The solutions from 2022 are not working anymore. Does anybody have a new solution for how to still get the same output table? Thank you very much in advance!
How to disable easy auth for specific routes in Flask app deployed to Azure?
To disable Easy Auth for specific routes in Azure, use a file-based configuration.
I followed this MS Doc to Enable file-based Authentication in Azure App Service.
I created an auth.json file, excluding the public routes and including the private routes.
auth.json:
{
  "platform": {
    "enabled": true
  },
  "globalValidation": {
    "unauthenticatedClientAction": "RedirectToLoginPage",
    "redirectToProvider": "AzureActiveDirectory",
    "excludedPaths": [
      "/api/public"
    ]
  },
  "httpSettings": {
    "requireHttps": true,
    "routes": {
      "apiPrefix": "/api"
    },
    "forwardProxy": {
      "convention": "NoProxy"
    }
  },
  "login": {
    "routes": {
      "logoutEndpoint": "/.auth/logout"
    },
    "tokenStore": {
      "enabled": true,
      "tokenRefreshExtensionHours": 12
    },
    "allowedExternalRedirectUrls": [
      "https://<AzureWebAppName>.azurewebsites.net/"
    ],
    "cookieExpiration": {
      "convention": "FixedTime",
      "timeToExpiration": "00:30:00"
    }
  },
  "identityProviders": {
    "azureActiveDirectory": {
      "enabled": true,
      "registration": {
        "openIdIssuer": "https://login.microsoftonline.com/<YOUR_TENANT_ID>/v2.0",
        "clientId": "<YOUR_CLIENT_ID>",
        "clientSecretSettingName": "APP_SETTING_CONTAINING_AAD_SECRET"
      },
      "login": {
        "loginParameters": [
          "scope=openid profile email"
        ]
      },
      "validation": {
        "allowedAudiences": [
          "api://<YOUR_CLIENT_ID>"
        ]
      }
    }
  }
}
I added the auth.json file to the /home/site/wwwroot/ path in Azure using the Kudu console via the below URL:
https://<AzureWebAppName>.scm.canadacentral-01.azurewebsites.net/newui
I created a file and saved it as authsettingsV2.json:
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Web/sites/config",
      "apiVersion": "2022-03-01",
      "name": "[concat(parameters('webAppName'), '/authsettingsV2')]",
      "properties": {
        "platform": {
          "enabled": true,
          "configFilePath": "auth.json"
        }
      }
    }
  ],
  "parameters": {
    "webAppName": {
      "type": "string"
    }
  }
}
I ran the below commands to create an ARM template for enabling file-based authentication.
az login
az account set --subscription "SubscriptionId"
az deployment group create --resource-group <ResourceGroupName> --template-file <PathTOauthsettingsV2.json> --parameters webAppName=<AzureWebAppName>
After running the above commands, file-based configuration is enabled as shown below.
Make sure the below values are set in the Environment Variables section of the Azure Web App, and add the client secret:
APP_SETTING_CONTAINING_AAD_SECRET:clientsecret
Change the redirect URL in the App Registration as shown below:
https://<AzureWebAppName>.canadacentral-01.azurewebsites.net/api/login/aad/callback
Azure Output public Route:
Protected Route:
I have the same issue. I don't think using variables in targets is currently supported.
crontab -e              # edit the crontab
service cron restart    # restart the cron service
service cron status     # check the service status
According to your regex, you must avoid spaces around the colon (" : ").
Test this record: {"secret-key":"1234"}
Or update the regex to "(secret-key)"\s*:\s*".*"
I would strongly recommend the use of "SETLOCAL" and "ENDLOCAL" in any bat-script that is to be called from another bat-script.
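A minimal sketch of why (the variable name is just an example): without SETLOCAL, variables set in the callee leak into the caller's environment.
@echo off
SETLOCAL
REM Variables set after SETLOCAL are discarded at ENDLOCAL (or at script exit),
REM so they do not clobber variables in the calling script.
SET WORKDIR=C:\Temp\work
REM ... use %WORKDIR% here ...
ENDLOCAL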
Change Sales to a plain old List<DetalleMensual> (needs to raise property changed in the property setter, if not already done). Set Sales just once (calling Add on an ObservableCollection might add unnecessary layout cycles).

You could set the environment variable PIPX_DEFAULT_PYTHON to use python3.11 (use pipx environment to list the available environment variables),
e.g. on macOS on Apple silicon:
export PIPX_DEFAULT_PYTHON=/opt/homebrew/bin/python3.11
Could you please share the versions you're currently using for Maps and React Native?
Hello Google team, please forward my number to another number, so that all the calls can be heard on my number. OK.
I also have a question. Could anyone help?
Why doesn't my map show the ticks of latitude and longitude?
ggplot() + geom_sf(data = shp) + coord_sf(crs = 4236)
Should I add anything else?
You need to use jq's map function. Please try the code below:
jq 'map(if .user == "user-2" then .videos += [{"key":"key3","url":"url3"}] else . end)' input.json
A PAX counter can only ever produce an estimate; it can never provide an exact figure. This is due to the technologies used, which have gone to great lengths in recent years to ensure that individual devices are not traceable or uniquely identifiable.
If you are using Gradle, go to Settings | Build, Execution, Deployment | Build Tools | Gradle, and in "Run tests using" select IntelliJ IDEA
I just downgraded rapier to a lower version and it also worked.
{'groups': 1, 'kernel_initializer': {'class_name': 'GlorotUniform', 'config': {'seed': None}}, 'kernel_regularizer': None, 'kernel_constraint': None}
From my experience, it would be easier to create the same SVG as a single line: not a shape, but a single line. That way you can add stroke-dasharray and stroke-dashoffset directly to the SVG, with no fill, just a single line. It will require some work to get the expected result, but it is nonetheless possible. I would suggest either doing it with CSS, or using a library like anime.js for this.
Sadly, downgrading to v13 is the only option if you want static generation for the whole application using output: 'export'.
No, according to the C standard, simply declaring a const variable without ever using it does not cause undefined behavior.
The presence of an unused const variable may lead to compiler warnings, or to the compiler optimizing it away, but the standard does not define this scenario as undefined behavior. It is purely a matter of optimization and static analysis, not correctness.
In short: declaring an unused const variable is allowed and safe; it will not trigger undefined behavior.
Please check that the principal (user principal or service principal) has the following configured:
veefu's answer did not work for my case, but it was the right hint.
Here is a real-world example; I needed to compare several AD objects more easily in a spreadsheet later:
$User = Get-ADObject -Identity "CN=User,OU=Users,OU=Company,DC=Company,DC=local" -Properties *
$User.psobject.Properties |
Select-Object @{name='Name';expression={$_.Name}},
@{name='Value';expression={$_.Value}} |
Export-Csv -Path User.csv -Encoding UTF8
Depending on your preference and region you might want to add -NoTypeInformation and/or -Delimiter ";".
The computation of CIE ΔE2000 is now available in 10 programming languages in the public domain at https://github.com/michel-leonard/ciede2000.
I just discovered the CSS rule display: contents, which can be applied to the child component. This allows it to contain <tr>s or <td>s, and they will flow naturally in the table.
Syncfusion pins their license keys to specific NuGet version ranges. Go to the Syncfusion dashboard and create/request a new license key for the updated NuGet package.
Even with temperature=0, GPT-4 can still exhibit variability due to backend optimizations like caching, token sampling, and beam search techniques. Additionally, OpenAI may introduce minor updates that subtly affect response generation.
If you're looking for consistent and optimized responses, check out DoCoreAI, which fine-tunes intelligence parameters dynamically instead of relying solely on temperature control. It helps minimize randomness and ensures structured, optimized responses.
Yes, many Node.js tools and modules provide functionality similar to Django Admin. Node.js doesn't have one built in, but there are third-party libraries and frameworks that help you create admin dashboards or interfaces for controlling your application.
After my GitHub admin gave me Git LFS permission, this problem was solved for me.
Did you manage to find the correct approach and implement the new Theming API? Please share.
Sorry, I found the problem. I just had to build the model JAR binary, and then the setter for vacationUUID could be invoked.
Go to the XAML code and set the Width property to "Auto". If it will just be one line, this will work, but if you want two or more lines to be resized, set Height to "Auto" as well.
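For illustration, something like this (the element is a placeholder for whatever control you are resizing):
<TextBox Width="Auto" />
<!-- for two or more lines, size both dimensions -->
<TextBox Width="Auto" Height="Auto" />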
I just added
gem "safe_yaml"
Boom!
I realised that they include the references in the response object. So, by just using result.final_output.references, I'll just add these to the model. However, this would only give a list of the citations used, not what was used for each metric.
I think that you should not use the loop "for value in result", because you are passing "values=result" in the last command, i.e. "tree.insert". Using the loop makes the same result get inserted into the table once per iteration.
So, after looking for alternatives for a while and checking the publishing log, I found that before the final published files are moved to the Application Files folder in the PublishUrl, the structure is replicated inside the PublishDir and deleted once moved to the final destination. Therefore, when doing any action during an AfterTargets="Publish" target, if trying to act upon the .deploy files, any command must point to the $(PublishDir)Application Files\... contents.
My final solution was something like the code shown below. The "replaces" are in place so that one can get the version-revision string in a "MyApp_X_X_X_X" format.
<Target Name="ExecuteSomething" AfterTargets="Publish" Condition="'$(Configuration)' == 'Release'">
<Exec Command='"Path\To\Something.exe"; "$(ProjectDir)$(PublishDir)Application Files\$(AssemblyName)_$(ApplicationVersion.Replace(`.`, `_`).Replace(`*`, `$(ApplicationRevision)`))\MyApp.dll.config.deploy"'/>
</Target>
Stripe does offer prorated charges through its usage-based billing and subscriptions features. If a user starts using your service in the middle of the month, you can set up prorated charges to ensure they only pay for the portion of the month they've used the service.
Also make sure Stripe is not blocking payments with their "fancy AI Radar" system.
If working in VS Code on Windows, install Python 3.9 from the Microsoft Store, open your terminal, create a virtual environment with "python3.9 -m venv myenv", and then run "pip install pyrealsense2".
Works perfectly.
It sounds like the issue you're facing with the slow performance on those two PCs might be due to low storage, outdated drivers, or some background processes that are using up resources. A good starting point would be to try Advanced System Optimizer or CCleaner to clean up disk space, update any outdated drivers, and optimize the system overall.
webView.navigationDelegate = self was missing in my viewDidLoad. Thanks to lazarevzubov for asking the right question.
Just to add an alternative solution to the one proposed by @austin-garrett of using RelayState. You can use a cookie with the data you want to track. Here more details: https://stackoverflow.com/a/70738120/16067905
Ahmed Omar Ali: please clarify my information for me, if you would.
Maybe this repo can solve this problem; see: https://github.com/Emt-lin/react-native-svga-player
I have the same issue, but with forward and backward slashes in the resultSet response.
For example, I expected to receive the values "1\Ф" or "1/Ф", but after rs.getString() I had only the value "1Ф" in both cases.
I wrote some tentative solutions, according to official resources, to avoid this issue: https://github.com/nagauta/day-one/tree/master/apps/react-csv
Thank you for your replies!
I tried applying the fix from the pull request [#670](https://github.com/ansible-collections/ansible.netcommon/pull/670), but unfortunately, it does not resolve the issue for me.
I updated `netconf_config.py` with the proposed modifications:
```diff
- from ansible.module_utils._text import to_text
+ from ansible.module_utils._text import to_native, to_text
- confirm = module.params["confirm"]
+ confirm = to_text(module.params["confirm"], error="surrogate_or_strict")
```
However, I noticed that there is a typo in the to_text() call. The parameter should be errors, not error.
Even after fixing this, I get the following error:
'>' not supported between instances of 'str' and 'int'
I also tried modifying the argument_spec to change the confirm parameter type:
With type "text":
confirm=dict(type="text", default="0")
Error:
argument 'confirm' is of type <class 'int'> and we were unable to convert to text: 'NoneType' object is not callable
"str"
confirm=dict(type="str", default="0")
Error:
'>' not supported between instances of 'str' and 'int'
At this point, I haven't found a working solution.
I have some Python knowledge, but I’m not an expert in Ansible module development, so I’m not sure where to go from here.
If anyone has insights on how to properly handle this, I would greatly appreciate the help! 🚀
I just created a separate venv, activated it, and ran both installation routines:
pip install office365-REST-Python-Client
pip install office365
After running pip list, this package list is shown:
Package Version
---------------------------- -----------
aenum 3.1.15
appdirs 1.4.4
APScheduler 3.11.0
azure-core 1.32.0
azure-storage-blob 12.25.0
beautifulsoup4 4.13.3
bs4 0.0.2
case_conversion 2.1.0
certifi 2025.1.31
cffi 1.17.1
chardet 5.2.0
charset-normalizer 3.4.1
clipboard 0.0.4
colorama 0.4.6
colour 0.1.5
cryptography 44.0.2
cursor 1.3.5
decorator 5.2.1
dill 0.3.9
fuzzywuzzy 0.18.0
gender-guesser 0.4.0
html_text 0.7.0
idna 3.10
imageio 2.37.0
imageio-ffmpeg 0.6.0
infi.systray 0.1.12.1
inflect 7.5.0
isodate 0.7.2
lxml 5.3.1
lxml_html_clean 0.4.1
maybe-else 0.2.1
mbstrdecoder 1.1.4
more-itertools 10.6.0
moviepy 2.1.2
msal 1.32.0
msoffcrypto-tool 5.4.2
numpy 2.2.4
o365 2.1.0
office365 0.3.15
Office365-REST-Python-Client 2.5.14
olefile 0.47
pandas 2.2.3
parsedatetime 2.6
pathmagic 0.3.14
pillow 10.4.0
pip 25.0.1
prettierfier 1.0.3
proglog 0.1.10
pycparser 2.22
pydub 0.25.1
pyinstrument 5.0.1
pyiotools 0.3.18
PyJWT 2.10.1
pymiscutils 0.3.14
PyPDF2 3.0.1
pyperclip 1.9.0
PyQt5 5.15.11
PyQt5-Qt5 5.15.2
PyQt5_sip 12.17.0
pysubtypes 0.3.18
python-dateutil 2.9.0.post0
python-docx 1.1.2
python-dotenv 1.0.1
pytz 2025.1
readchar 4.2.1
regex 2024.11.6
requests 2.32.3
Send2Trash 1.8.3
setuptools 58.1.0
simplejson 3.20.1
six 1.17.0
soupsieve 2.6
tabulate 0.9.0
tqdm 4.67.1
typeguard 4.4.2
typepy 1.3.4
typing_extensions 4.12.2
tzdata 2025.1
tzlocal 5.3.1
urllib3 2.3.0
XlsxWriter 3.2.2
Running the import routine
from office365.sharepoint.client_context import ClientContext
raises this error:
ModuleNotFoundError: No module named 'office365'
So the method mentioned on this page is not working. Does anyone have a solution?
Just like Nils posted in the comment from @randomuser, the possible solution could be to place the git plugin at the bottom, after all the notes are generated.
Also, I added the git plugin as a dependency in my package.json:
"devDependencies": {
"@semantic-release/changelog": "^6.0.3",
"@semantic-release/git": "^10.0.1"
}
This is because *ngIf="selectedId()" is evaluated as FALSE when the value is 0: https://developer.mozilla.org/en-US/docs/Glossary/Falsy
Something like:
<div *ngIf="selectedId()?.toString() as id">Current id: {{ id }}</div>
should do the trick. If your id is null, this will not show anything, but if it is 0, it will be evaluated as true, since it is then a string.
Considering my issue is the same as https://github.com/jOOQ/jOOQ/issues/14582 (and that issue was closed), it boils down to a version problem. I'm using Postgres 15 and should not be using a jOOQ version which only supports up to Postgres 14.
For Linux/bash users: remove all release retentions from a pipeline.
#!/bin/bash
PIPELINE_ID=${1}
PROJECT=<PROJECTNAME>
ORGANIZATION=<ORGANIZATIONNAME>
TOKEN="test:<TOKEN>"
leases=$(curl -u "${TOKEN}" -X "GET" "https://dev.azure.com/${ORGANIZATION}/${PROJECT}/_apis/build/retention/leases?api-version=6.0-preview.1&definitionId=${PIPELINE_ID}" | jq .value[].leaseId)
echo $leases
for lease in $leases; do
echo $lease
curl -u "${TOKEN}" -X "DELETE" "https://dev.azure.com/${ORGANIZATION}/${PROJECT}/_apis/build/retention/leases?ids=${lease}&api-version=6.0-preview.1"
done
You can create the token from the "User Settings" dropdown in the top right corner.
You can obtain the pipeline IDs from the portal, from something like "_build?definitionId=42" in the URL,
or use the following command to get all names and IDs from a project:
curl -u "${TOKEN}" -X "GET" "https://dev.azure.com/${ORGANIZATION}/${PROJECT}/_apis/build/definitions?api-version=3.2" | jq '.value[] | .id,.name'
The error occurs due to a mismatch in encryption settings between the client and server. To resolve it, update the server’s configuration file (firebird.conf) by setting WireCrypt = Disabled, then restart the server. Ensure the client-side settings match, and try removing wireCrypt if issues persist. Also, check for compatibility between the client and server versions. After making changes, restart both the database and application.
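Concretely, the server-side change described above is one line in firebird.conf, followed by a server restart:
WireCrypt = Disabled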
If Executable imports the entry point which executes the application (for example index.js), then debugging is possible with:
node --inspect-brk node_modules/myutil/index.js [options]
Any options of myutil can be passed at the [options] placeholder.
I had multiple startup projects in a clean architecture, ProjectA.API and ProjectA.UI. The project that handles DBContext is ProjectA.Infrastructure. I corrected this issue with the following steps:
Make the API project the only startup project since it interacts with the infrastructure layer alone.
Add the Microsoft.EntityFrameworkCore.Design package to the API project and ensure that the version you are using is the same as that of Microsoft.EntityFrameworkCore.SqlServer in the infrastructure project.
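For example, with the .NET CLI (project name and version here are illustrative; match the version of your SqlServer package):
dotnet add ProjectA.API package Microsoft.EntityFrameworkCore.Design --version 8.0.0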
In the newest version of ruamel.yaml (0.18.10), how do I preserve the backslash of a multi-line string?
I use code like the following:
import pathlib
from ruamel.yaml import YAML
control_file = 'xx/nist_ocp4.yml'
yaml = YAML()
yaml.preserve_quotes = True
d = yaml.load(pathlib.Path(control_file))
yaml.dump(d, pathlib.Path(control_file))
Before running code:
controls:
- id: AC-1
status: not applicable
description: "The organization:\n a. Develops, documents, and disseminates to [Assignment:\
\ or whenever a significant change occurs]"
After running code:
controls:
- id: AC-1
status: not applicable
description: "The organization:\n a. Develops, documents, and disseminates to [Assignment:
or whenever a significant change occurs]"
The Open Link in Small Window Chrome extension does not work well on a Mac if the browser is in full screen.
I managed to get this to work a little differently, still using the Access DB Engine and some NETFX tools.
Instructions, a test file (a simple .rtf with one text line and an image) used in my tuning, and my LibreOffice Basic project file:
This solution means that this Python project is created and managed by uv, so there should be a pyproject.toml in the root dir.
You can then run uv sync to create a venv and install dependencies quickly.
uv sync: syncs the project's dependencies with the environment.
I am currently working on Azure Function Apps in a .NET 8 isolated environment. In this project, we are trying something new that I have never heard of before. Specifically, I am creating multiple Azure Function projects within a single solution. I successfully set this up, and it worked locally. However, after deploying to Azure, I encountered some errors.
One major issue is that the Timer Triggers I added are not appearing in the Function tab, where triggers are usually displayed. When I checked for errors, I found the following message:
No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g., Azure Storage, ServiceBus, Timers, etc.), make sure you've called the registration method for the extension(s) in your startup code (e.g., builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
Regarding my deployment structure, inside the wwwroot folder there are two subfolders, each containing its respective DLLs and host.json files. I modified my YAML configuration to copy host.json out of the subfolders so that it exists at the root level, alongside the two folders. However, host.json is still present inside each subfolder, meaning there are now three host.json files in total.
I have verified that my trigger functions are correctly implemented, but the issue persists. I am unsure how to proceed. Does anyone have experience with this or any guidance on resolving it?
I'm facing the same problem, but my code works; my variable has the STT data, but the terminal is filled with this error every time I use it, after every while loop.
I am working on a financial chat-board project and got a task about OCR: extracting data from PDF bank statements. For example, suppose you had 50 transactions last month on a bank statement across different categories (Uber, restaurants, groceries, and so on); I want to extract all the transactions as JSON, like this:
"Deposits and Additions": [ { "date": "01/23", "details": { "company_name": "ORIG", "origin_id": "", "date_description": "CO Entr we havey", "name": "UNKNOWN", "id": "" }, "amount": "$344.27" } ]
I am using different libraries, but the result looks static: some transactions are missed in the extracted data. I don't want to miss any transaction. How can I make my OCR more powerful? Please suggest.
Here is the scripted solution that I used to implement @Sridevi's answer:
$appname = "YourApplication"
### Connect to Graph (to get the service principal)
Connect-MgGraph -ShowBanner:$false
$app = Get-MgServicePrincipal -Filter "displayname eq '$appname'"
Disconnect-MgGraph
### Verify there's exactly one app
$appcount = ($app | measure-object).count
if ($appcount -ne 1) {
throw("$Found $appcount apps with displayname '$appname', this isn't right.")
}
### Connect to IPPS to set everything
Connect-IPPSSession -ShowBanner:$false
$sp = get-serviceprincipal -Identity $app.appid
if (($sp | Measure-Object).count -eq 0) {
try {
$sp = New-ServicePrincipal -AppId $app.appid -ObjectId $app.id -Displayname "$appname - Purge"
} catch {
throw("Can't generate service principal")
}
}
$rolemember = Get-RoleGroupMember -Identity "eDiscoveryManager" | Where-Object { $_.exchangeObjectId -eq $app.id }
if (($rolemember | Measure-Object).count -eq 0) {
Add-RoleGroupMember -Identity "eDiscoveryManager" -Member $app.id
}
$eadmin = Get-eDiscoveryCaseAdmin | Where-Object { $_.exchangeObjectId -eq $app.id }
if (($eadmin | Measure-Object).count -eq 0) {
Add-eDiscoveryCaseAdmin -User $app.id
}
Disconnect-ExchangeOnline
I suspect one of the prime reasons is the involvement of all the nested and otherwise invisible borders; I've run into similar problems in the past. A viable workaround for me is to use tools that are stronger at extracting text with positional information, like pdfplumber. Extracting tables right away appears to be difficult in this case; a two-fold approach of first extracting text (not tabular, but still correct and well-spaced) and then doing some additional manual parsing on top, via tools like regex or parse, could be a good way forward.
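A minimal sketch of the positional-text approach with pdfplumber (the file name is hypothetical):
import pdfplumber

with pdfplumber.open("statement.pdf") as pdf:
    for page in pdf.pages:
        # extract_words() returns dicts with the text and its coordinates
        for word in page.extract_words():
            print(word["text"], word["x0"], word["top"])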
I now have 2 working solutions, thanks to @rioV8 and @starball:
1. package.json - add the following contribution to the correct section:
"configurationDefaults": {
  "[ABC]": {
    "editor.minimap.markSectionHeaderRegex": " ... regex goes here ... "
  }
}
2. activate() - additional code within the extension:
const scope: vscode.ConfigurationScope = { languageId: "ABC" };
const inspect = vscode.workspace.getConfiguration("editor", scope).inspect("minimap.markSectionHeaderRegex");
if ( !inspect?.workspaceLanguageValue )
{
vscode.workspace.getConfiguration("editor", scope).update("minimap.markSectionHeaderRegex", " ... regex goes here ... ", vscode.ConfigurationTarget.Workspace, true);
}
I went for solution 1, as it seems to be the cleaner solution for the extension I'm working on.
I have the same problem. Did you find a solution?
You need to iterate over the causes of the exception and print that out.
Something like this:
private String getMessage(Exception e) {
var result = e.getMessage();
var cause = e.getCause();
var maxDepth = 3;
while (cause != null && maxDepth-- > 0) {
result = cause.getMessage();
cause = cause.getCause();
}
return result;
}
Replace the 3 with your own logic.
See my exceptionmapper here:
I found an article that explains the meaning of the new blocks-manifest.php file.
I guess there is presently a bug in the WordPress (or webpack) scripts: the npm start script removes the blocks-manifest.php file from the build directory.
In the meantime, if you want to continue to work on your block until the bug is fixed, you can modify the main plugin file and register your blocks one by one in the old way, like this:
function create_block_block_toto_block_init() {
register_block_type( __DIR__ . '/build/toto1' );
register_block_type( __DIR__ . '/build/toto2' );
}
add_action( 'init', 'create_block_block_toto_block_init' );
It will work; blocks-manifest.php will be ignored.
If you are attempting to use an AMI from another region, this error happens. The current marked answer saved me a lot of time in figuring out what the root cause was for me.
If you are using a data source to look up AMIs, you will need to create a data source for each region in which you need to look up AMIs. If you are using a static list from a table, you will need to make sure your lookup has the AMIs for all possible regions and factors the region into the lookup.
Hopefully that will be helpful context for someone. I do not have the reputation to add comments to the current answer or I would have.
These plugins can help automate the conversion of Figma designs into React components with Tailwind classes:
Anima is a powerful plugin that converts Figma designs into React code (JSX) with Tailwind CSS support. It generates responsive and pixel-perfect code, including spacing, typography, and layout.
This plugin generates Tailwind CSS classes directly from Figma designs. It helps you quickly apply Tailwind utility classes to your components.
$password = 'JohnDoe';
$hashedPassword = Hash::make($password);
echo $hashedPassword; // $2y$10$jSAr/RwmjhwioDlJErOk9OQEO7huLz9O6Iuf/udyGbHPiTNuB3Iuy
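Since Hash::make salts every hash (so the output differs on each run), verify with Hash::check instead of comparing strings:
$isValid = Hash::check('JohnDoe', $hashedPassword); // true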
MariaDB natively supports VARBINARY: https://mariadb.com/kb/en/varbinary/
You would simply have to set the size value large enough to match your existing data (i.e. to cover for the 'max' provision).
Directly install from GitHub
pip install git+https://github.com/allenai/allennlp.git
This may also create a conflict with the numpy package, so downgrade it (quoted, so the shell does not treat < as a redirection):
pip install "numpy<2.0.0"
For me, what worked was to use the right path when calling the web socket from the front end, something like "wss://a1d5-83-39-106-145.ngrok-free.app/ws/frontend". The "/ws/frontend" part is very important.
In my case, after updating Xcode and the Simulator, I started encountering the error:
“Framework ‘Toast’ not found”
For me, the issue was resolved by downgrading the fluttertoast version from:
fluttertoast: ^8.2.12 → fluttertoast: 8.2.8
After changing the fluttertoast version to 8.2.8, the error disappeared, and the build was successful.
I’m not sure about sqflite, but you might want to give it a try.
Recruitment SOP
1. Purpose
The purpose is to establish a clear and consistent process for recruiting employees to ensure that the organization attracts, hires, and retains the best talent.
2. Scope
This SOP applies to all positions within the organization and covers all stages of the recruitment process, from job requisition to final offer acceptance and onboarding.
3. Roles and Responsibilities
AGM HR
Oversee the entire recruitment process.
Ensure that all recruitment activities are in compliance with legal and organizational standards.
Approve job requisitions and advertisements.
Ensure proper documentation and records are maintained.
Hiring Manager:
Obtaining approval in MRF
Define job requirements, skills, and competencies.
Participate in the interview process.
Provide feedback and make final hiring decisions.
Recruitment Team
Post job ads and handle initial screening of candidates.
Sourcing and screening of candidates.
Scheduling for Interviews.
Coordination with the Stakeholders for interview process
Evaluating the candidates.
Approval in Appointment Clearance form.
Issuing Offer letter to the new joining candidate
Candidates:
Submit applications and required documentation as per the job posting.
Participate in interviews and assessments as scheduled.
4. Recruitment Process Steps
Step 1: Job Requisition
The Hiring Manager submits a job requisition request to the HR department, which includes:
Job title
Job description
Required skills and qualifications
Salary range and benefits
Any other relevant details
Step 2: MRF Approvals
The Hiring Manager or department head submits a formal request for new manpower or for replacement of an existing employee.
Step 3: Job Advertisement
The recruitment team creates and posts the job advertisement on relevant platforms (company website, job boards, social media, recruitment agencies, etc.).
Ensure job postings are consistent, clear, and inclusive.
Job postings should remain open for a specific period (e.g., 2-4 weeks).
Step 4: Resume Screening
The recruitment team reviews the applications and shortlists candidates based on qualifications, experience, and skills.
Ensure that candidates meet the basic job requirements before moving forward.
Step 5: Initial Interview
The recruitment team conducts a preliminary phone or video interview to assess the candidate’s suitability for the role.
Questions should focus on general qualifications, skills, and fit within the company culture.
Step 6: Assessment/Skill Tests (if applicable)
Some positions may require candidates to complete skill assessments, coding tests, or personality evaluations.
The Hiring Manager or HR team will coordinate these tests and evaluate the results.
Step 7: Final Interview
A face-to-face or virtual interview with the Hiring Manager and Director
Interviews should assess technical expertise, problem-solving skills, and cultural fit.
Step 8: Salary Fixation & Negotiation
Salary Discussion: During the interview or offer stage, HR or the hiring manager discusses salary expectations with the candidate. Ensure clear communication about compensation components (base salary, bonuses, benefits, etc.).
Negotiation Process: The candidate may propose a salary higher than initially offered. In this case, HR or the hiring manager must negotiate within the company’s pay structure and budgetary limits, considering the candidate’s expectations and the organization's constraints.
Step 9: Reference and Documentation Checks
Conduct reference checks from previous employers to verify candidate experience and performance.
Perform background checks (criminal, educational, and employment verification) as per the organization's policy.
Step 10: Appointment Clearance
HR verifies that the candidate has met all internal requirements and procedures for appointment. This includes checking the completion of all documentation and background checks.
The hiring manager confirms that the candidate meets the expectations and requirements for the role, including fit within the team and company culture.
A signed approval on the proposal document from the Hiring Head and respective Director
Step 11: Job Offer
Once a candidate is selected, the HR department prepares a job offer letter that includes:
Job title
Terms and conditions of employment
The HR Manager and TA will discuss the offer with the selected candidate.
Step 12: Offer Acceptance and Onboarding
Once the candidate accepts the offer, HR prepares for onboarding:
Provide employment contract and any required documents.
Set up an orientation session to introduce the new hire to the company, its culture, and systems.
Step 13: Background Verification
Background verification is typically initiated after the candidate has successfully passed the interview process and has been selected for the role, but before the final appointment or offer letter is issued.
5. Documentation and Record-Keeping
Ensure that all recruitment-related documentation (MRF, applications, interview notes, job offers, and background checks) is stored securely.
Maintain a record of all job postings and the candidates who were interviewed, along with the final decision.
6. Continuous Improvement
Regularly review and evaluate the recruitment process to ensure its effectiveness.
Seek feedback from candidates and hiring managers to identify areas for improvement.
Overall Recruitment TAT: Establish an overall recruitment process TAT target, such as 30–45 days from job requisition to candidate hiring.
7. Appendices
Manpower Request Form
Interview Evaluation Form
Appointment Clearance Form
Job Appointment Letter Template
I have explained all the steps with screenshots; see if it helps:
https://github.com/CMS365-PTY-LTD/EbaySharp?tab=readme-ov-file#access-and-security
My answer is not very scientific, but in my case the reason was messing with the diagnostics options!
the uploaded image is currently working fine!
Firstly, thank you to @Alexander Zeitler. I am posting a complete example here of what worked for me. Alexander's code did not work for me out of the box as I had to adjust the form data to suit my purposes.
internal async Task PostIt()
{
var username = "<Your username here>";
var password = "<Your password here>";
var baseAddress = new Uri("<Your POST Url here>");
var cookieContainer = new CookieContainer();
using (var handler = new HttpClientHandler() { CookieContainer = cookieContainer, AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate })
using (var client = new HttpClient(handler) { BaseAddress = baseAddress })
{
client.DefaultRequestHeaders.Add("accept-encoding", "deflate");
client.DefaultRequestHeaders.Add("user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36");
client.DefaultRequestHeaders.Add("Origin","<Origin URL here>");
client.DefaultRequestHeaders.Referrer = new Uri("<Referrer URL here>");
client.DefaultRequestHeaders.Add("connection", "keep-alive");
client.Timeout = TimeSpan.FromMilliseconds(5000);
// Add authorization headers here...
var byteArray = new UTF8Encoding().GetBytes($"{username}:{password}");
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteArray));
/*
Next comes the form data. I had to customize this to meet the requirements of the <baseAddress> server.
Example: A Http debugger showed me when I was logging into <baseAddress>, the 'content' posted was as such:
"login_username=<username here>&login_password=<password here>&redirect_url=<redirect URL here>&site=www"
@Alexander Zeitler's code has thus been modified below.....
*/
var formData = new List<KeyValuePair<string, string>>();
formData.Add(new KeyValuePair<string, string>("login_username", username));
formData.Add(new KeyValuePair<string, string>("login_password", password));
formData.Add(new KeyValuePair<string, string>("redirect_url", "<redirect URL here>"));
formData.Add(new KeyValuePair<string, string>("site", "www"));
var request = new HttpRequestMessage(HttpMethod.Post, baseAddress);
request.Content = new FormUrlEncodedContent(formData);
var response = await client.SendAsync(request);
if (response.IsSuccessStatusCode)
{
//do whatever........
}
else
{
}
}
}
Have you seen this article about working with the soft keyboard input mode on Android in .NET MAUI? https://learn.microsoft.com/en-us/dotnet/maui/android/platform-specifics/soft-keyboard-input-mode?view=net-maui-9.0
Deploying with Azure CLI, I found that I got the:
'The parameter WEBSITE_CONTENTSHARE has an invalid value.'
error because my function app --name flag had an underscore in it.
For anyone else with this error - try removing the underscores.
See: Restrict Emoji Input and Enforce Character Length in Swift (with UITextField & UITextView)
It is therefore possible to limit writes as mentioned above. However, it is still impossible to limit reads, especially on public data.
The solution I'm thinking of implementing is to create a NestJS API (or similar) that takes a Firebase Query as input and executes it.
This also requires taking into account real-time reads with websockets and update/create/delete operations.
This is a real shame, as the promise of firebase is to dispense with the need for an API. But with this system, it could be easy to implement external tools like CloudFlare or Google Cloud Armor to protect against DDOS.
The solution isn't ideal, however; if a security rule made this possible, it would be better.
You can vote for this feature and post comments to ask the Firebase team to speed up development on it.
I have same issue; I found that the read operation is blocked by Trellix Endpoint Security for Linux Threat Prevention (mfetpd).
It's resolved now.
One thing you can do is combine the specs into a single "batch".
If you make a spec file called, for instance, home-about.spec.js, that imports the home spec and the about spec
import './home-page-test.spec.js'
import './about-page-test.spec.js'
then the command yarn cypress run --spec cypress/e2e/home-about.spec.js produces the following results.
That method might be a bit simplistic, depending on what your npm run home-page-test, npm run about-page-test, etc. commands are doing.
It might be better to create a Node.js script using the Cypress Module API, where you can add various options and also process results and errors.
// e2e-run-tests.js
const cypress = require('cypress')
cypress
.run({
spec: [
'./cypress/e2e/home-page-test.spec.js',
'./cypress/e2e/about-page-test.spec.js',
]
})
Run it with the command node e2e-run-tests.js
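To also process results and errors as mentioned, the promise returned by cypress.run() can be inspected; a sketch (totalFailed is a field of the Module API result object):
// e2e-run-tests.js, extended to fail the process on test failures
const cypress = require('cypress')
cypress
  .run({
    spec: [
      './cypress/e2e/home-page-test.spec.js',
      './cypress/e2e/about-page-test.spec.js',
    ],
  })
  .then((results) => {
    if (results.totalFailed > 0) process.exit(1)
  })
  .catch((err) => {
    console.error(err)
    process.exit(1)
  })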
I also faced the same issue; it was resolved when I put the correct path for base in my config file:
const viteConfig = {
plugins: [react(), svgr()],
base: "./",
server: {
port: 3000,
open: true,
},
build: {
target: "esnext",
outDir: "dist",
sourcemap: true,
rollupOptions: {
output: {
manualChunks: {
vendor: ["react", "react-dom"],
},
},
},
},
};
According to the flink-kubernetes-operator documentation (see: https://nightlies.apache.org/flink/flink-kubernetes-operator-docs-main/docs/operations/configuration/), with job.autoscaler.vertex.min-parallelism you can set the minimum parallelism of your pipeline.
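If you configure this through a FlinkDeployment resource, it goes in the flinkConfiguration map; a sketch (the value shown is illustrative):
flinkConfiguration:
  job.autoscaler.enabled: "true"
  job.autoscaler.vertex.min-parallelism: "2"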
Hi everyone! I'm new to the forum and would like to help, as I ran into the same issue. A picture is worth a thousand words, ergo...
I know this is old, but I basically use abstract as if I am telling Google what the article is about in 1 long descriptive sentence.
I have found the solution to my problem on Stack Overflow:
<button mat-flat-button class="tertiary-button">Tertiary</button>
.tertiary-button {
@include mat.button-color($light-theme, $color-variant: tertiary);
}
The issue I was having was due to having the Kestrel configuration in appsettings.json as well as in Program.cs. For some reason, it was adding another Kestrel instance to the one already configured in the Program.cs code. The Kestrel config and HTTPS cert loaded in Program.cs had already been working with the same Dockerfile and compose.yml. I only noticed it when the same error got triggered at app.Run() in Program.cs, whereas previously it was triggered at the certificate-loading code, also in Program.cs.