Rename postcss.config.js to postcss.config.mjs.
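A minimal ESM config of that shape might look like this (the plugin set is just an illustration):
export default {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
};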
This took forever for me to figure out. The related things people mention about cron are unrelated. Go to Settings > General > Sharing > Remote Login, click the "i", and enable "Allow full disk access for remote users".
from zipfile import ZipFile
import os
# Create the folder structure to simulate the two PowerPoints
os.makedirs('/mnt/data/Matrix_PPT/Ejemplo - cuento', exist_ok=True)
os.makedirs('/mnt/data/Matrix_PPT/Ejemplo - QTE', exist_ok=True)
# Create empty files simulating the PowerPoints
with open('/mnt/data/Matrix_PPT/Ejemplo - cuento/Ejemplo - cuento.pptx', 'w') as f:
    f.write('Presentación de ejemplo - cuento Matrix')
with open('/mnt/data/Matrix_PPT/Ejemplo - QTE/Ejemplo - QTE.pptx', 'w') as f:
    f.write('Presentación de ejemplo - QTE Matrix')
# Create a zip containing both files
zip_path = '/mnt/data/Ejemplos_Matrix_PPT.zip'
with ZipFile(zip_path, 'w') as zipf:
    zipf.write('/mnt/data/Matrix_PPT/Ejemplo - cuento/Ejemplo - cuento.pptx', arcname='Ejemplo - cuento.pptx')
    zipf.write('/mnt/data/Matrix_PPT/Ejemplo - QTE/Ejemplo - QTE.pptx', arcname='Ejemplo - QTE.pptx')
zip_path  # Show the path of the generated zip file
PyGObject and GTK are painful to build and install on Windows, so I made a Python+GTK all-in-one (AIO) archive with Python 3.11.10, GTK 3.24.43, and PyGObject 3.51.0. It builds with MSVC on GitHub Actions, so it's reproducible and easier to maintain. It's less like an installer and more like a portable application, but it can be packaged with your favorite installer (e.g., NSIS, InnoSetup).
build: {
  rollupOptions: {
    external: [
      '@emotion/react',
      '@emotion/styled',
      'framer-motion'
    ],
  },
},
Did you ever manage to fix it? I'm having the same issue, whereby embeddings are being retrieved correctly, but the agent simply sends an "output":"" as a response.
Below are the steps to upgrade the Node version using NVM:
Open the console and list the installed versions with: nvm list
To install the <NODE_VERSION> version, run: nvm install <NODE_VERSION>
After the installation is complete, NVM will have added that Node version to its list. To check, run: nvm list
To switch the Node version, run: nvm use <NODE_VERSION>
To verify the Node version switch, run:
node -v
For reference: https://www.erpluse.com/2018/11/nvm-installation-on-windows.html
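For example, moving to Node 18 (the exact version number here is just an illustration):
nvm install 18.17.0
nvm use 18.17.0
node -v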
I have been working on a sheet which had been working A-OK with
var UI = SpreadsheetApp.getUi();
on the fifth line. Suddenly I was getting the "cannot call from this context" message. Landed here and decided I would try closing and reopening the sheet, and voilà, the error is gone. Strange goings on...
Using @Quentin's advice in the comments, as a workaround I was able to make it work by using translateY as an external variable, instead of getting the previous value from a DOMMatrix.
const box = document.querySelector('.box');
let id = undefined;
let translateY = 0;
function animateTest() {
box.style.transform = `rotate(12deg) translateY(${translateY}px)`;
translateY += 2;
id = requestAnimationFrame(() => animateTest());
}
id = requestAnimationFrame(() => animateTest());
let button = document.querySelector('.button');
button.addEventListener('click', function() {
cancelAnimationFrame(id);
translateY = 0;
id = requestAnimationFrame(() => animateTest());
}, false);
setTimeout(()=> {cancelAnimationFrame(id)}, 5000);
.box {
margin-left: 300px;
width: 10px;
height: 10px;
background-color: black;
transform: rotate(15deg);
}
<button class='button'>rerun</button>
<div class="box"></div>
Found that the issue was that, when I originally passed the data from the Postgres table to the EJS file in the GET route (not included above), I had only included the two columns that showed up, rather than the entire table.
Changed database query from this:
db.query("SELECT task_description, color FROM tasks WHERE section='To Do'");
To this:
db.query("SELECT * FROM tasks WHERE section='To Do'");
If all you want to do is add a volume, the JSON patch format is probably better.
Below is an example that starts a container without the database running.
kubectl run -it --rm psql \
--image=tensorchord/pgvecto-rs:pg14-v0.2.0@sha256:90724186f0a3517cf6914295b5ab410db9ce23190a2d9d0b9dd6463e3fa298f0 \
--override-type=json \
--overrides='
[
{"op": "add", "path": "/spec/containers/0/volumeMounts", "value": [{"mountPath": "/var/lib/postgresql/data", "name": "pgdata"}]},
{"op": "add", "path": "/spec/volumes", "value": [{"hostPath": {"path": "/host/db"}, "name": "pgdata"}]}
]
' --command -- sh
I'm not 100% sure, but shouldn't the drop-table command be
log.info(f"Dropping Table: mygrc.{table_name}")
dynamic_class.__table__.drop(dbEngine, checkfirst=True)
I think you're forgetting the dot just after dynamic_class.
Also, instead of dropping the table, maybe you can use the constructor with the extend_existing parameter set to True?
This can be closed.
A co-worker fixed the issue, so certain comments have a value going towards 0 from below.
I already decided that I will use our company Maven mirror and move the jars there; not all of them have a Maven dependency.
You can check this package: django-url-group-permissions, https://pypi.org/project/django-url-group-permissions/
A Django package for managing URL-based permissions through user groups, with HTTP method support.
I had three tables; a slicer was built on the first one.
In the sheet's macro module I inserted code that tracks changes to the slicer on the sheet; if there are changes, a procedure runs that brings the other tables in line with the slicer on the first one.
' Flag to prevent recursion
Private IsSyncing As Boolean
' Variable holding the previous slicer state
Private previousSelectedValues As String
Private Sub Worksheet_Calculate()
' THIS CODE MUST LIVE IN THE SHEET MODULE
Dim sc As SlicerCache
Set sc = ThisWorkbook.SlicerCaches("Срез_РЦ") ' Put your cache name here
If sc Is Nothing Then Exit Sub
' Get the currently selected values
Dim currentSelectedValues As String
currentSelectedValues = GetSelectedSlicerItems(sc)
' Compare with the previous state
If currentSelectedValues <> previousSelectedValues Then
previousSelectedValues = currentSelectedValues
' Run the filter synchronization
If Not IsSyncing Then
IsSyncing = True
СинхронизироватьФильтры
IsSyncing = False
End If
End If
End Sub
Function GetSelectedSlicerItems(sc As SlicerCache) As String
Dim result As String
Dim si As SlicerItem
For Each si In sc.SlicerItems
If si.Selected Then
result = result & si.Value & vbCrLf
End If
Next si
GetSelectedSlicerItems = result
End Function
And then, via a procedure in a regular module, the code that aligns the tables' filters to the slicer is inserted:
' This code must live in a REGULAR MODULE (e.g., Module1)
Sub СинхронизироватьФильтры()
' This procedure shows up in the macro list (Alt + F8)
Dim sc As SlicerCache
Set sc = ThisWorkbook.SlicerCaches("Срез_РЦ") ' Put your slicer cache name here
If Not sc Is Nothing Then
' Collect the selected items
Dim selectedItems As Collection
Set selectedItems = New Collection
Dim si As SlicerItem
For Each si In sc.SlicerItems
If si.Selected Then selectedItems.Add si.Value
Next si
' Apply the filters to the tables
ApplyFilterToTable "План", "РЦ", selectedItems
ApplyFilterToTable "Выполнение", "РЦ", selectedItems
'MsgBox "Filters synchronized!", vbInformation
Else
MsgBox "Slicer not found!", vbExclamation
End If
End Sub
Sub ApplyFilterToTable(tableName As String, field As String, items As Collection)
' Procedure that filters the tables
Dim ws As Worksheet
Set ws = ThisWorkbook.Sheets("Показатели запрос") ' Put your sheet name here
Dim lo As ListObject
Set lo = ws.ListObjects(tableName)
' Find the column index
Dim colIndex As Integer
colIndex = lo.ListColumns(field).Index
' Clear the current filter
On Error Resume Next
lo.Range.AutoFilter field:=colIndex
On Error GoTo 0
' Apply the new filter
If items.Count > 0 Then
Dim criteria() As String
ReDim criteria(1 To items.Count)
Dim i As Integer
For i = 1 To items.Count
criteria(i) = CStr(items(i))
Next i
' Use the constant 7 (xlFilterValues)
lo.Range.AutoFilter _
field:=colIndex, _
Criteria1:=criteria, _
Operator:=7
End If
End Sub
Problem sending e-mail in a Java project using Jakarta Mail
Context: In a Java project using Jakarta Mail, an error occurred related to the lack of a complete implementation of the email API. The specific error generated was:
java.lang.IllegalStateException: No provider of jakarta.mail.util.StreamProvider was found
Solution: The problem was solved by removing the dependencies:
jakarta.mail-api-2.1.0.jar
jakarta.activation-api-2.1.2.jar
These did not provide a complete implementation of the email API. The solution was to replace those dependencies with:
com.sun.mail:jakarta.mail version 2.0.1.
Maven dependency:
<dependency>
<groupId>com.sun.mail</groupId>
<artifactId>jakarta.mail</artifactId>
<version>2.0.1</version>
</dependency>
This dependency provides the complete implementation needed for sending e-mail.
To send mail for work, use an app password and enable two-step verification on your account, since Google disabled the "less secure app access" feature.
An app password is a 16-digit code that authorizes a less secure device or app to access your Google Account. App passwords can only be used on accounts that have two-step verification enabled.
Site to register: https://support.google.com/accounts/answer/185833?p=InvalidSecondFactor
Only after this could we complete the whole process for sending e-mail.
I don't have the reputation to comment on Jason's post above, but that is the correct answer according to Docker's patch notes here: https://docs.docker.com/desktop/release-notes/
For Mac
- Downgraded Linux kernel to v6.10.14 to fix a bug in OpenJDK that causes Java containers to terminate due to cgroups controller misidentification. See docker/for-mac#7573.
- Added /usr/share/misc/usb.ids in the root mount namespace to fix usbip.
- Fixed an issue where the display of the CPU limit was capped at 8 when using Docker VMM.
- Fixed an issue where startup would hang and the com.docker.backend process consumed 100% of the CPU. See docker/for-mac#6951.
- Fixed a bug that caused all Java programs running on M4 Macbook Pro to emit a SIGILL error. See docker/for-mac#7583.
- Blocked startup on macOS 15.4 beta 1 since starting VMs will cause the host to crash, see https://developer.apple.com/documentation/macos-release-notes/macos-15_4-release-notes#Virtual-Machines.
- Fixed an issue where the myIPAddress PAC file function retrieved the host IP from the wrong interface, causing incorrect proxy selection.
A cl_program can have an associated cl_context, which is tied to a specific device. In this case, only building for the devices tied to the cl_context makes sense, and so the device list can be left NULL.
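A minimal sketch of that call (assuming program was created on the context in question):
// NULL device list: build for every device attached to the program's context
cl_int err = clBuildProgram(program, 0, NULL, "", NULL, NULL);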
Could you provide more details about the issue you're facing with clean_names from the janitor package? It would be helpful to know if you're encountering any error messages, as well as how your dataset looks before and after applying the function. Additionally, knowing the versions of janitor and pandas you're using could help diagnose the problem more effectively. With this information, we can better understand the issue and provide a solution. Looking forward to your response!
I have done this using a ref: <img src="/eye.svg" width={20} />
// so I want to change this img src
1) Add a function to it using onClick and a ref:
<img src="/eye.svg" width={20} ref={ref} onClick={Function_name} />
2) Here's the function code. We used src.includes to check whether the string (that is, the previous path) exists. I made this function to toggle the img on click; if you just want to change the src once, use the version without the toggle.
a) Without toggle:
const ref = useRef()
const Function_name = () => {
  ref.current.src = "/hidden.svg"
}
b) With toggle:
const ref = useRef()
const Function_name = () => {
  if (ref.current.src.includes("/eye.svg")) {
    ref.current.src = "/hidden.svg"
  } else {
    ref.current.src = "/eye.svg"
  }
}
You can simply visit the link below and download the Microsoft Visual C++ Redistributable that matches your system requirements.
https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist?view=msvc-170
The GitHub team recommends using git lfs migrate
https://github.com/git-lfs/git-lfs/issues/326#issuecomment-234317594
https://github.com/git-lfs/git-lfs/blob/main/docs/man/git-lfs-migrate.adoc
Can anyone help decode what I need to do to fix this? Latest Octave on Sequoia, and I can't get control or signal to install:
3 warnings generated.
ld: warning: duplicate -bundle_loader option, '/usr/local/Cellar/octave/9.4.0/bin/octave-9.4.0' ignored
ld: warning: search path '/usr/local/opt/gcc/bin/../lib/gcc/current/gcc/x86_64-apple-darwin23/14' not found
ld: warning: search path '/usr/local/opt/gcc/bin/../lib/gcc/current/gcc/x86_64-apple-darwin23/14/../../..' not found
ld: library 'emutls_w' not found
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [__control_slicot_functions__.oct] Error 1
cd slicot/src && \
/usr/local/Cellar/octave/9.4.0/bin/mkoctfile-9.4.0 -w -c MA02ID.f; mv MA02ID.f x && \
/usr/local/Cellar/octave/9.4.0/bin/mkoctfile-9.4.0 -c *.f ../../TG04BX.f ../../src_aux/*.f && \
mv x MA02ID.f
/usr/local/Cellar/octave/9.4.0/bin/mkoctfile-9.4.0 -Wall __control_helper_functions__.cc
ar -rc slicotlibrary.a slicot/src/*.o
LDFLAGS="-L/usr/local/Cellar/octave/9.4.0/lib/octave/9.4.0 -L/usr/local/Cellar/octave/9.4.0/lib -bundle -undefined dynamic_lookup -bind_at_load -bundle_loader /usr/local/Cellar/octave/9.4.0/bin/octave-9.4.0 -L/usr/local/opt/openblas/lib -lopenblas -L/usr/local/opt/gcc/bin/../lib/gcc/current/gcc/x86_64-apple-darwin23/14 -L/usr/local/opt/gcc/bin/../lib/gcc/current/gcc -L/usr/local/opt/gcc/bin/../lib/gcc/current/gcc/x86_64-apple-darwin23/14/../../.. -lemutls_w -lheapt_w -lgfortran -lquadmath" \
/usr/local/Cellar/octave/9.4.0/bin/mkoctfile-9.4.0 -Wall __control_slicot_functions__.cc common.cc slicotlibrary.a
error: pkg: error running 'make' for the control package
error: called from
configure_make at line 117 column 9
install at line 202 column 7
I had this error using <p-inputnumber>, which turned out to be a capitalization issue. Angular didn't recognize <p-inputnumber> as a component because tags are case-sensitive; the correct tag is <p-inputNumber>. I need to pay more attention to the case.
I'm also dealing with the same issue. Did you get a solution for this? Please respond. Thanks
This was simply a user error. I'd been using the csrf package and hadn't spotted that the package to use next time around was a subtly different name - csurf.
Check this https://pypi.org/project/django-url-group-permissions/
A Django package for managing URL-based permissions through user groups with HTTP method support
As @herrstrietzel said, kerning won't be applied to text with a different layout context.
And as @BehRouz mentioned, you can work around it by applying a negative margin-left (or margin-inline-start) in em units.
With JavaScript and opentype.js (getKerningValue), you can get all the kernings.
I know this is an old post, but I thought I would share my answer. We handle names from all over the world in our database. It came to my attention that we actually had names with a mixture of Romanian diacritical characters and Latin diacritical characters. This caused the accepted answer not to work. Lucky for us, we keep up to date on our SQL database, and in the 2019 version they introduced UTF8 versions of collations. What fixed our queries was to use the Latin1_General_100_CI_AI_SC_UTF8 collation in the LIKE clause; it now finds all matches even when the user types in the ASCII values instead of the Unicode ones. Example: "Ştefănuţ" finds all "Stefanut" and "Ştefănuţ" variations and vice versa.
I made a mistake. My lib folder already contained an index.js file, at project_root/lib/index.js, and the main in package.json was "lib/index.js", but my TypeScript file is at project_root/src/index.ts, so I kept deploying the same old file to Firebase. Oh my God...
So I changed "main": "lib/index.js" to "main": "lib/src/index.js" to fix this issue.
To keep both the link and the tr:
<TableRow>
<Link to={`/example`} style={{ display: 'contents', verticalAlign: 'middle' }}>
{columns.map((column) => (
<TableCell key={column}>{renderCellContent(column)}</TableCell>
))}
</Link>
</TableRow>
This solution https://stackoverflow.com/a/71777413/5407635 works, but in the WebStorm IDE there is an error because v-theme-green is not recognized.
You need to draw the rectangle using the handle returned by d.begin_shader_mode(&shader), not just with d.draw_rectangle(), to render the shader.
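A sketch of that pattern, assuming the raylib-rs bindings:
let mut d = rl.begin_drawing(&thread);
{
    // drawing through the handle returned by begin_shader_mode applies the shader
    let mut sm = d.begin_shader_mode(&shader);
    sm.draw_rectangle(0, 0, 320, 240, Color::WHITE);
} // shader mode ends when `sm` is dropped
d.draw_rectangle(0, 250, 320, 240, Color::RED); // this rectangle is NOT shaded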
Solved this by using { encoding: 'jsonParsed' }, which gives the parsed data, including the balance and token mint.
This reference led me in the right direction: Upgrading nuget package to 8.0.0 breaks UI for versioned APIs in asp.net core
While I am not yet using OpenAPI generation, simply clearing the MS Edge browser cache did the trick for me.
To expand on @e-shcherbo's dotnetfiddle, and for anyone wanting to leverage more functionality provided by Markdig's header parsing, the following query enables the generation of a table of contents using the headers parsed by Markdig.
The presentation result would be similar to the 'In this article' list provided in the sidebar of a MDN article.
Add the .UseAutoIdentifiers() extension to the Markdig pipeline so that an id attribute is assigned to each HeadingBlock AST node during parsing. The ID is generated using the text of the header.
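For context, a minimal sketch of that pipeline setup (markdownText is assumed to hold your Markdown source; document is used in the query below):
using Markdig;
using Markdig.Syntax;

var pipeline = new MarkdownPipelineBuilder()
    .UseAutoIdentifiers() // assigns an id to each HeadingBlock
    .Build();
MarkdownDocument document = Markdown.Parse(markdownText, pipeline);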
IEnumerable<Header> headers = document
.Descendants<HeadingBlock>()
.Where(hb => hb?.Level != null && !string.IsNullOrWhiteSpace(hb?.Inline?.FirstChild?.ToString()))
.Select(hb => new Header()
{
Id = hb?.GetAttributes().Id,
Level = hb?.Level,
Text = hb?.Inline?.FirstChild?.ToString(),
});
The following is the result of the above LINQ query on the Markdown sample listed in the original post:
The ID value is retrieved using the GetAttributes() method on a HeadingBlock node.
The Level property on the HeadingBlock node is the key value that enables a branching table of contents sidebar.
You can extend the Where() predicate to retrieve only level 2 (h2) headers (e.g. .Where(hb => hb?.Level == 2)), similar to the list provided in the 'In this article' MDN article sidebar.
The .Where() predicate skips HeadingBlock nodes that have an empty string.
The Header type in the .Select() method is defined as:
public class Header
{
public string? Id { get; set; }
public string? Text { get; set; }
public int? Level { get; set; }
}
Your approach is great, but won't work specifically with managed Kafka Connect. When you submit a JAR as a plugin, it doesn't get added to the classpath of the Kafka Connect worker. It goes into a plugin dir. Each plugin has its own classloader, for isolation of dependencies, etc. This is how Kafka Connect workers use plugins.
So, this means that the MirrorSource Connector won't see the libraries and classes for a custom replication policy.
Alternatively, if you are just trying to achieve same-topic-name replication, there is an IdentityReplicationPolicy available in Kafka 3. To use it, define your MM2 connector to run in MSK Connect using version 3.7, as shown below.
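For reference, a sketch of the relevant MM2 connector config entry (class name as shipped in Apache Kafka 3.x):
"replication.policy.class": "org.apache.kafka.connect.mirror.IdentityReplicationPolicy"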
I can confirm the ojdbc6-11.2.0.1.0.jar that came with Oracle 11.2.0.1.0 can work with Oracle 19c if you set SQLNET.ALLOWED_LOGON_VERSION_SERVER=11 on the server side. The same should hold for ojdbc6-11.2.0.4.jar.
According to the OJDBC FAQ, under the question "Oracle JDBC releases Vs JDK versions", the last ojdbc6 came with Oracle 12.2, but I don't have experience with that one and I can't find it in the Maven repository.
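For reference, the server-side sqlnet.ora entry mentioned above is a single line:
SQLNET.ALLOWED_LOGON_VERSION_SERVER=11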
Option 1: Use the CDN (Quick & Easy, No Build Process)
Add the following CDN link inside the <head> tag of your index.html file:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Tailwind CSS in HTML</title>
<script src="https://cdn.tailwindcss.com"></script>
</head>
<body class="bg-gray-100 text-center p-10">
<h1 class="text-3xl font-bold text-blue-600">Hello, Tailwind CSS in HTML!</h1>
</body>
</html>
Option 2: Install Tailwind via NPM (Recommended for IntelliSense & Customization)
If you want Tailwind IntelliSense to work in HTML, follow these steps:
1. Initialize a project (if you don’t have package.json already)
npm init -y
2. Install Tailwind CSS locally
npm install -D tailwindcss postcss autoprefixer
3. Generate a Tailwind config file
npx tailwindcss init -p
4. Configure Tailwind to scan your HTML files (in tailwind.config.js, update the content section)
/** @type {import('tailwindcss').Config} */
export default {
content: ["./*.html"],
theme: {
extend: {},
},
plugins: [],
}
5. Create a CSS file for Tailwind (inside your project folder, create a new file styles.css and add this)
@tailwind base;
@tailwind components;
@tailwind utilities;
6. Build Tailwind CSS (run the following command to generate a Tailwind-ready CSS file)
npx tailwindcss -i ./styles.css -o ./dist/output.css --watch
Try to use a GRAPH request instead of "HTML To Text" action - note CustomHeader1 content below.
See the whole sample at https://community.powerplatform.com/galleries/gallery-posts/?postid=952229ef-5a0a-f011-bae2-6045bdedea0b
To answer a comment from @Saurabh Verma: how can I parse nested tables in HTML using Jsoup and convert the extracted data to JSON?
I had a use case where I needed to parse nested tables in HTML and extract the data into a JSON format. Below is an example of how to navigate through nested tables and extract data using Jsoup; the key selector for reaching the inner table is:
Element contentNodesTable = doc.select("td:contains(Content Nodes:) + td table").first();
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;
import java.util.HashMap;
import java.util.Map;
public class JsoupExample {
public static void main(String[] args) {
String html = "<table border=\"1\"><tr><td>Start Date:</td><td>2024-03-27 04:04:47.612PM</td></tr><tr><td>End Date:</td><td>2024-03-27 04:04:47.737PM</td></tr><tr><td>Duration:</td><td>0d 0h 0m 0s 125.237ms</td></tr><tr><td>Successful:</td><td>Yes</td></tr><tr><td>Content Nodes:</td><td><table border=\"1\" cellspacing=\"0\" cellpadding=\"1\"><tr><td># Created</td><td># Replaced</td><td># Skipped</td><td>Data Written</td><td># Properties</td></tr><tr><td>0</td><td>1</td><td>0</td><td>180.37kB</td><td>26</td></tr></table></td></tr></table>";
Document doc = Jsoup.parse(html);
Map<String, Object> rep = new HashMap<>();
rep.put("Start Date", doc.select("td:contains(Start Date:) + td").text());
rep.put("End Date", doc.select("td:contains(End Date:) + td").text());
rep.put("Duration", doc.select("td:contains(Duration:) + td").text());
rep.put("Successful", doc.select("td:contains(Successful:) + td").text());
Map<String, Object> contentNodes = new HashMap<>();
Element contentNodesTable = doc.select("td:contains(Content Nodes:) + td table").first();
// Extract data from the inner table
Elements rows = contentNodesTable.select("tr");
Elements headers = rows.get(0).select("td");
Elements values = rows.get(1).select("td");
for (int i = 0; i < headers.size(); i++) {
String header = headers.get(i).text();
String value = values.get(i).text();
contentNodes.put(header, value);
}
rep.put("Content Nodes", contentNodes);
System.out.println("JSON Response: " + rep);
}
}
There's no need to upgrade to React 19 to use AG Grid v33.2.1.
AG Grid v33.2.1 fully supports React 18 as per the React compatibility table: https://www.ag-grid.com/react-data-grid/compatibility/#ag-grid--react-compatibility-chart
Having a quick look, I wonder if it's because "SelfAsserted-LocalAccountSignin-Phone-Email" doesn't have PartnerClaimType="Verified.Email" set?
Thank you life888888 for such a detailed answer above. Taking inspiration from it, I did something simpler for my use case to make it work. Instead of trying to create the binary in the Dockerfile itself, I simply used the Dockerfile to create an environment matching the one I had on the host. This is defined in the BASE_IMAGE argument, where it pulls the image I needed (which had the right Linux OS and Go installed). So this is simply how my Dockerfile looked:
ARG BASE_IMAGE
ARG BASE_TAG
FROM ${BASE_IMAGE}:${BASE_TAG} AS base
WORKDIR /workspace
COPY go.mod go.mod
COPY go.sum go.sum
ADD . go_mylib
RUN go mod download
Then I build the container on my M1 Mac using the command:
$ docker build -t go_mylib:v1 --platform linux/amd64 .
Run the image using:
$ docker run -i -t --sysctl net.ipv6.conf.all.disable_ipv6=0 --platform linux/amd64 --name go_mylib-v1 go_mylib:v1 /bin/bash
Once inside the container's bash, I go into the project folder and then run the command to create a shared library:
# cd go_mylib/
# go build -o libmybinary.so -buildmode=c-shared main.go
And finally exiting the container, I copy the binary generated to my local folder using this command:
$ docker cp go_mylib-v1:/workspace/go_mylib/libmybinary.so .
(where /workspace/go_mylib/libmybinary.so is the path of the file inside my container and . refers to the current folder on my local system).
I finally load this library from my Java code by using:
MyLib INSTANCE = Native.load("mybinary", MyLib.class);
and it works on the host as expected.
You will also need to customize the LinkHandlers. There is an example in the docs for how to add nofollow to all links. You will need to modify it to accept your dynamic attribute. https://docs.wagtail.org/en/stable/extending/rich_text_internals.html#registering-rewrite-handlers
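For illustration, a sketch close to the docs' nofollow example, adapted to inject an attribute (the class name and data attribute are hypothetical; Wagtail 4+ import paths assumed):
from django.utils.html import escape
from wagtail import hooks
from wagtail.rich_text import LinkHandler

class AttributedExternalLinkHandler(LinkHandler):
    identifier = "external"

    @classmethod
    def expand_db_attributes(cls, attrs):
        href = attrs["href"]
        # inject whatever dynamic attribute you need into the rendered tag
        return f'<a href="{escape(href)}" rel="nofollow" data-dynamic="example">'

@hooks.register("register_rich_text_features")
def register_external_link(features):
    features.register_link_type(AttributedExternalLinkHandler)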
Be aware of a new vulnerability: https://www.wiz.io/blog/ingress-nginx-kubernetes-vulnerabilities
From this:
https://community.dremio.com/t/is-parameterized-sql-not-supported-by-dremio/4650/2
It appears that, as of now, the ODBC driver does not support parameterised queries, which would likely explain this.
The current driver is 0.9.1 from July 2022, so approximately 2.5 years since any updates.
I am experiencing a similar issue.
My problem:
The Simulator only allows 18.3.1, and the app runs fine there... the app also runs fine on a physical device on a lower iOS version, but crashes on 18.3.2 immediately after login.
And the super weird thing is that it crashes only if I log in via Apple; it does not crash on login with Google or normal login. The app runs totally fine if I log in via another mechanism.
Apple is rejecting my submission citing "app crashes after login" (iPad Air (5th gen), 18.3.2). I think we have the same problem. Do you also use Sign in with Apple?
I have no idea how to fix this; I've spent days on it. Please let me know if there is any fix.
I think this is due to Apple's new security patch in 18.3.2.
"The hostname, if not specified or specified as '' or 'localhost', will default to a MySQL server running on the local machine using the default for the UNIX socket. To connect to a MySQL server on the local machine via TCP, you must specify the loopback IP address (127.0.0.1) as the host."
"To connect to a MySQL server on localhost using TCP/IP, you must specify the hostname as 127.0.0.1 (with the optional port)."
Did someone figure it out? I have the same problem right now and don't know how to solve it. Please help...
On a Mac concerning Docker Desktop, downloading the most current version from their website (since I didn't install via Homebrew) and running it worked. I did not need to uninstall or remove any files.
I can now answer my own question;
I disabled dev mode and things are now fine on localhost:3000. Well, at least I learnt a lot about the nuxt font module, which never was the issue.
Sorted.
".T"
Just sharing my frustration with this requirement for having a Microsoft account. I was invited to join Azure DevOps for a project, and I created a new Microsoft account as required; I do not want to use my existing Microsoft account since it is for personal use. Then the new account got locked. I created 3 new Microsoft accounts; all got locked for suspicious activity.
To unlock, you must have a phone number in a supported country, and even with that I still cannot unlock the Microsoft account.
./run.sh runs the simulator.
./build.sh ios_source builds for iOS via Xcode, and android_source builds for Android Studio.
Not sure how to open and use the GUI builder.
I was having this problem with tensorflow-cpu version 2.19; I just installed pip install tensorflow_cpu==2.18 and it worked.
Possible solutions:
Custom scope:
Select "Custom scope" in the "Generate Javadoc..." settings, you restrict Javadoc to processing only your project's source code. This isolates Javadoc from the conflicting external library dependencies.
"-exclude" flag:
The "-exclude" flag, when added to the "Command line arguments" field, provides control over which packages are excluded.
For example, "-exclude org.jetbrains.kotlinx.*" prevents Javadoc from processing any packages within the "org.jetbrains.kotlinx" namespace.
Use the built-in PHP function checkdate():
<?php
$date = explode('-', '31-02-2025');
var_dump(checkdate($date[1], $date[0], $date[2]));
/* bool(false) */
var_dump(checkdate(2, 28, 2025));
/* 28-02-2025 => bool(true) */
Does anyone have an answer to this??
In my case the problem was that my tsconfig.json was missing emitDecoratorMetadata: true.
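For reference, the relevant tsconfig.json section looks like this (experimentalDecorators is usually enabled alongside it):
{
  "compilerOptions": {
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true
  }
}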
It needed auth_method: :token
def connection
@connection ||= Apnotic::Connection.new(
cert_path: StringIO.new(credentials.apns_key),
auth_method: :token,
team_id: credentials.team_id,
key_id: credentials.key_id,
bundle_identifier: credentials.bundle_identifier
)
end
Since Kotlin 2.0.20, if a Kotlin exported function uses a Set, List, or Map type, you'll be able to handle those types from JavaScript. See the Kotlin documentation.
I really like the rlang functions to achieve this:
my_fun <- function(){
rlang::call_name(call = rlang::current_call())
}
my_fun()
#> [1] "my_fun"
Created on 2025-03-27 with reprex v2.1.1
I'll give you a proper solution for this issue:
Mainly, this error occurs because of a caching issue, so following these steps should resolve your problem.
I find that
adb shell am start -n com.android.settings/.Settings
is more helpful, so I can dig down into all the settings.
I tried to perform this calculation using an 'and' function instead of multiple 'if' statements, and it does not work. Can anyone explain why it doesn't work?
{=PERCENTILE(
if(
and(
NUMBERVALUE(LEFT($A$1:$A$18,4))<=EndYear,
NUMBERVALUE(LEFT($A$1:$A$18,4))>=BegYear,
NUMBERVALUE(RIGHT($A$1:$A$18,1))<=EndMonth,
NUMBERVALUE(RIGHT($A$1:$A$18,1))>=BegMonth
),
$C$1:$C$18),.5)}
For me, installing SoundFile for Windows using pip install soundfile helped.
Many thanks, Ivar.
Developer Tools > Network showed status: 200 (from cache).
I then did <three dots> > Settings > Privacy and security > Delete browsing data. Clicked "Cached images and files". Unclicked the others. And then clicked "Delete data".
It's good now. Thanks.
(Though I wonder how ltd.foo.com got into this state but sub1.foo.com did not. Not a big concern at the moment.)
Try to pull from a SharePoint people field, like "Created By" or a custom person field.
Calculate the gradient (np.gradient) of the points you have, plot this data (gradient vs. length along the line, i.e. arc length), and see where the moving average of the gradient begins to change. I picked the moving average because it should account for noise (or you can use an appropriate filter such as Savitzky–Golay). I know this is an OLD post.
DESCRIBE PROCEDURE returns a result set with the column BODY, which contains the procedure body in the original ($$ notation) format.
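For example (hypothetical procedure name and signature):
DESCRIBE PROCEDURE my_proc(VARCHAR);
-- the BODY column of the result set contains the procedure body in the original $$ notation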
I have been having the exact same issue as well!
The strange thing is that I had a script where it was plotting correctly; then I wrote a separate script where the issue began to occur. When I went back to the original script and ran it again to see how to fix it, the same issue had started to occur there also! So I am presuming that it was some modification to the package's configuration... I don't know...
But I uninstalled and reinstalled again and the issue is still there....
Did you ever work out what was going wrong?
Right-click your console application in Solution Explorer > Properties, then:
Use spaces to separate arguments, and use "" quotes if you have a space in a specific argument, e.g. "param 3".
Strangely enough, I am using https and I could fetch from the console or in VSCode just fine.
What seems to have fixed it for me was to switch from Embedded Git to System Git.
There is no upper limit defined for the size of a JWT. The JSON Web Token (JWT) standard (RFC 7519) does not specify a maximum token size. But practical limits depend on where and how the JWT is used.
When used as an HTTP header:
If the token is passed as a bearer token in an HTTP header, many web servers do not allow headers larger than 8 KB. It's safe to keep it to 7 KB.
When the JWT is stored in a cookie:
Browsers usually support cookies up to 4 KB, hence it's better to keep to that limit.
When storing in a database:
We need to make sure the database column size is sufficient to hold the token.
I have encountered the problem of using a large token and not being able to use it to invoke the API, either from Postman or from the browser (while passing the JWT in the HTTP header). In that case, we had to find an alternate solution. We generated a pair of JWTs: one full token and a smaller version of it, both containing the same JTI value (a UUID that uniquely identifies the token). The full token contains a lot of claims; the smaller one contains minimal basic details. The full token is stored in a Redis cache with the JTI as the key. The UI uses the small token to invoke APIs. When the full token is needed on the backend, it takes the JTI from the small token passed by the UI and uses it to fetch the full token from the Redis cache.
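A rough sketch of that pattern (hypothetical claims and keys, using the pyjwt and redis packages):
import uuid
import jwt
import redis

r = redis.Redis()
jti = str(uuid.uuid4())  # shared JTI linking the two tokens

full_token = jwt.encode({"jti": jti, "sub": "user42", "roles": ["admin", "auditor"]},
                        "secret", algorithm="HS256")
small_token = jwt.encode({"jti": jti, "sub": "user42"}, "secret", algorithm="HS256")

r.setex(jti, 3600, full_token)  # backend can look up the full token by JTI

# Backend side: recover the full token from the small one
incoming_jti = jwt.decode(small_token, "secret", algorithms=["HS256"])["jti"]
full_again = r.get(incoming_jti)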
But it's suggested to keep the claims minimal to the need, so that these kinds of size-related issues don't occur.
As you mentioned it works whenever you include the proper PR/Build url, it seems the problem is with the pattern you inserted.
You can try using this regex, for example, considering the changing data in the URL will be non numeric:
/https:\/\/my-site-[A-Za-z]+--pr[A-Za-z]+\.web\.app/g
As per the official GCP documentation:
“You can switch a Public NAT gateway from automatic NAT IP address allocation to manual NAT IP address assignment; however, the NAT IP addresses cannot be preserved. Even though automatically allocated NAT IP addresses are static, they cannot be moved to a manual NAT IP address assignment.”
Public NAT also automatically removes a NAT IP address when it no longer needs any source ports on that NAT IP address.
So it is not possible to reserve an ephemeral IP address used by Cloud NAT.
So try changing the IP to manual to keep your IP address, though you'll have the rework of changing your systems and communicating the changes to customers.
You can create a new issue or feature request Issue Tracker thread describing your issue.
The solution was to change the YValueType for the series from myChart1.Series[1].YValueType = ChartValueType.Int32; to myChart1.Series[1].YValueType = ChartValueType.Double;.
Now both the minimum and maximum Y values on the chart correspond to the values set in the code.
myChart1.ChartAreas[0].AxisY.Minimum = 9.3;
myChart1.ChartAreas[0].AxisY.Maximum = 10.3;
Encryption of session IDs and cookies prevents the user from modifying the values. By changing the content, a potential attacker could perform denial-of-service attacks by triggering pathological (worst-case) algorithmic complexity on data structures.
This issue has been raised on the Spring Boot GitHub and should be fixed in a future version: SSL config does not watch for symlink file changes · Issue #44807 · spring-projects/spring-boot
Even though there's already an accepted answer, I thought this one was good to share.
const timeInSeconds = new Date().getTime() / 1000 | 0;
This works because bitwise operations on floats convert them to 32-bit integers, thus truncating the decimal part. (Note the 32-bit cap: for epoch seconds this stays safe until 2038.)
Try these steps:
source .venv/bin/activate
pip list | grep openai-agents
If the package isn't listed, install it explicitly:
pip install openai-agents
from openai import OpenAI
# Or possibly
from openai.agents import ...
Your settings.json has a placeholder pythonX.X - make sure you've replaced this with your actual Python version (e.g., python3.10).
The .env file setup looks correct, but verify it's in the root directory of your project.
Try running your code directly from the terminal with the activated venv rather than through the IDE to isolate whether it's an IDE configuration issue.
If you're still having issues, check if the package has specific installation requirements:
pip show openai-agents
Could you share the exact import statement you're using and the full error message? That would help pinpoint the exact issue.
I know this is a late reply, but for anyone searching for workaround to this, just insert a blank Braille block character and in PowerBI it looks like a space. Just copy the space between the following quotes and paste it into your report and PowerBI shows a space. "⠀"
Link below to where I found this:
Is there an invisible character that is not regarded as whitespace?
I finally got it working by following this comment https://stackoverflow.com/a/43684021/24823862
Below is the code for anyone who wants to use it.
from M2Crypto import RSA
import binascii
import hashlib
import base64
# Data to hash
data = 'helloworld'
# Compute SHA-1 hash
sha1_hash = hashlib.sha1(data.encode('utf-8')).digest()
# Convert SHA-1 hash to hex string
sha1_hash_str = binascii.hexlify(sha1_hash).decode('utf-8')
# Read private key
private_key = RSA.load_key('key.pem')
# Encrypt the SHA-1 hash string using private key
ciphertext = private_key.private_encrypt(sha1_hash_str.encode('utf-8'), RSA.pkcs1_padding)
#encrypted_message = str(base64.b64encode(ciphertext), 'utf8')
encrypted_message = binascii.hexlify(ciphertext).decode('utf-8')
print(encrypted_message)
Generated hash:
01540a01e7c372f4d1395221ec90f68a0f4dbc123af9d032b768fd141c0b0a6420e0dcf903739dd2729cfdbf81bcd9512cc39ad4bd26239eab23069fdaf4e6fe
I got M2Crypto running on an Ubuntu 22.04.3 LTS machine with Python 3.10.12 and followed these instructions:
M2 crypto repo - https://gitlab.com/m2crypto/m2crypto/-/tree/master
instructions to install - https://gitlab.com/m2crypto/m2crypto/-/blob/master/INSTALL.rst
Attaching two useful Stack Overflow posts that helped me
Have you resolved this issue somehow? I am facing exactly the same problem right now.
You can only sync once a day, via the System tab under parameter settings. [Image of the System Parameter Setting]
@AjayKumar claims in the comment to their answer that this should now be resolved, but it still does not work for me.
This may be because our developer users are guest users from another Azure tenant, I don't know.
Regardless, here's an implementation that tests whether the DefaultAzureCredential is a user, and only if it's a user, it uses Azure Resource Manager (ARM) to fetch the ACS connection string. For this to work, your user must be Contributor on the whole subscription - but surely that's not a problem since you have a developer subscription for this sort of stuff, riight? (:
This solution uses the following packages:
The code:
using Azure;
using Azure.Communication.Email;
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Communication;
using System.Net.Mail;
using System.Text;
using System.Text.Json;
namespace Your.Namespace.Here;
public sealed class AzureEmailer(AzureEmailer.Config config)
{
public sealed record Config(Uri AzureCommunicationServiceInstanceBaseUrl);
public async Task SendEmailAsync(
string from,
List<(string EmailAddress, string DisplayName)> to,
string subject,
string body,
bool isHtmlEmailBody = false,
List<(string Filename, string ContentType, BinaryData Data)>? attachments = null,
List<(string EmailAddress, string DisplayName)>? cc = null,
List<(string EmailAddress, string DisplayName)>? bcc = null
) {
var emailClient = await GetClientAsync(config.AzureCommunicationServiceInstanceBaseUrl);
EmailContent emailContent;
if(isHtmlEmailBody)
emailContent = new EmailContent(subject)
{
Html = body
};
else
emailContent = new EmailContent(subject)
{
PlainText = body
};
var recipients = new EmailRecipients(
to.Select(x => new EmailAddress(x.EmailAddress, x.DisplayName)),
cc?.Select(x => new EmailAddress(x.EmailAddress, x.DisplayName)),
bcc?.Select(x => new EmailAddress(x.EmailAddress, x.DisplayName))
);
var message = new EmailMessage(from, recipients, emailContent);
if(attachments is not null)
foreach(var attachment in attachments)
message.Attachments.Add(new EmailAttachment(attachment.Filename, attachment.ContentType, attachment.Data));
await emailClient.SendAsync(WaitUntil.Completed, message);
}
private static Dictionary<Uri, EmailClient> ClientCache { get; } = new();
private static SemaphoreSlim OneAtATime { get; } = new(1, 1);
private static async Task<EmailClient> GetClientAsync(Uri AcsBaseUrl)
{
await OneAtATime.WaitAsync();
try
{
if(ClientCache.TryGetValue(AcsBaseUrl, out var client))
return client;
EmailClient newClient;
var isUser = await IsAzureDefaultCredentialUserCredentials();
Console.WriteLine($"{nameof(AzureEmailer)} says \"IsUser == {isUser}\"");
if(isUser)
newClient = await GetClientFromAzureUserIdentityWorkaroundAsync(AcsBaseUrl);
else
newClient = new EmailClient(AcsBaseUrl, new DefaultAzureCredential());
ClientCache.Add(AcsBaseUrl, newClient);
return newClient;
}
finally
{
OneAtATime.Release();
}
}
private static async Task<bool> IsAzureDefaultCredentialUserCredentials()
{
try
{
var credential = new DefaultAzureCredential();
var context = new TokenRequestContext(new[] { "https://management.azure.com/.default" });
var token = await credential.GetTokenAsync(context);
var tokenParts = token.Token.Split('.');
//We don't actually know in this case because we couldn't parse the token.
//But the true-case is used for a workaround in dev, so it's better to make code that's inconvenient for dev
//yet still works in prod.
if(tokenParts.Length != 3)
return false;
var payload = tokenParts[1];
var payloadBase64Unpadded = payload.Replace('-', '+').Replace('_', '/');
var remainder = payloadBase64Unpadded.Length % 4;
string payloadbase64;
if(remainder == 0)
payloadbase64 = payloadBase64Unpadded;
//remainder == 1 should not be possible because of how base64 works.
else if(remainder == 2)
payloadbase64 = payloadBase64Unpadded + "==";
else if(remainder == 3)
payloadbase64 = payloadBase64Unpadded + "=";
else
return false; //again, we couldn't parse the token. Better to just return false.
var json = Encoding.UTF8.GetString(Convert.FromBase64String(payloadbase64));
var claims = JsonDocument.Parse(json).RootElement;
return claims.TryGetProperty("idtyp", out var idType) && idType.GetString() == "user";
}
catch
{
//again, better to just break the developer-flow than break prod.
return false;
}
}
private static async Task<EmailClient> GetClientFromAzureUserIdentityWorkaroundAsync(Uri acsBaseUrl)
{
//There's a bug(?) in Azure Communication Serices, that means that if your default azure credential is based on a
//user, then you cannot use the normal managed identity flow like your managed identity applications can.
//This is despite how most (all?) other services that work with managed identity accept user identities fine.
//The following is an extremely rough hack to make the dev-experience require the same inputs as prod.
//For speed these requests could be parallelized since there's a bit of waiting time.
var armClient = new ArmClient(new DefaultAzureCredential());
await foreach(var tenant in armClient.GetTenants().GetAllAsync())
{
//For reasons we need to make an arm client for the specific tenant or
//we won't be able to get the list of subscriptions - we'd instead get an empty list.
var tenantArmClient = new ArmClient(new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
TenantId = tenant.Data.TenantId.ToString()
}));
await foreach(var subscription in tenantArmClient.GetSubscriptions().GetAllAsync())
{
try
{
await foreach(var acsInstance in subscription.GetCommunicationServiceResourcesAsync())
if(string.Equals(acsInstance.Data.HostName, acsBaseUrl.Host, StringComparison.OrdinalIgnoreCase))
{
var keys = await acsInstance.GetKeysAsync();
return new EmailClient(keys.Value.PrimaryConnectionString);
}
}
catch(RequestFailedException ex) when (ex.ErrorCode == "SubscriptionNotRegistered")
{
//This subscription does not have permissions to use azure communication service,
//which means our target ACS instance won't be in this subscription anyway, so we'll just move on.
}
}
}
throw new Exception($"The requested Azure Communication Service instance with ({acsBaseUrl}) was not found.");
}
}
I was able to fix this by uninstalling the app from my Android emulator and doing a fresh installation. After uninstalling the app, just run:
npm run android
Here you have the problem with that bug and its resolution: https://github.com/miguelpruivo/flutter_file_picker/issues/1743. If you set compressionQuality: 0 in your code, it works successfully without duplicate images:
FilePickerResult? result = await FilePicker.platform.pickFiles(
allowMultiple: true,
type: FileType.image,
compressionQuality: 0
);
I was using CDK to deploy.
Unfortunately, when something changed, it was not forcing a deploy to Stage, so I had to deploy it manually.
You just need to overwrite the margin in the .cdk-drag-preview class:
.cdk-drag-preview {
margin-right: auto !important;
}
Check it on the stackblitz example.
I am still struggling to add structured data in my Next.js 15 app with the App Router. Can anyone please let me know how I can add page-specific schema data? Based on the Next.js docs and this post, I have used a JS object and included it in the page's main section using a tag. But it is getting rendered in the body, and when I try to validate it in a schema validator tool, it is unable to validate. Can you suggest the best solution?
I know this is an old question, but I stumbled upon it today looking for an answer myself. Here is my solution, which seems to work better than what was provided here. In my particular case, I want some IDs to be linked to each other. In my table I have a column lkMother which records the chain of events. Basically, I want to record the price of a widget on such-and-such a date. Then I want to record the new price as another entry in the table, but I want to show that one of these items replaces the other. We may add several price changes. Then we get a notice from our supplier that they have come out with widget 2.0 to replace the original widget. Even though they aren't exactly the same thing, I want to record that, as far as we are concerned, they are a replacement.
So this is what I do:
The original item receives an ID number.
Some time later we receive a replacement. The replacement receives the original item's ID number. The original item is assigned the next available ID, but its original ID is recorded in the lkMother column.
Repeat ad infinitum.
So when a customer calls and says they need a replacement for a part they purchased on such-and-such a date, I look up the part and go to the lkMother column, which gives me the current equivalent.
Here is a stored procedure that swaps the IDs. Note: this procedure requires that there be no item in the table with ID = 1. It could easily be modified to fit another scenario.
'''
CREATE PROCEDURE `noe_pricebookswapids` (intid1 INT, intid2 INT)
BEGIN
SET SQL_SAFE_UPDATES = 0;
UPDATE `Order`.tblPrice
SET ID = 1 WHERE ID = intid1;
UPDATE `Order`.tblPrice
SET ID = intid1 WHERE ID = intid2;
UPDATE `Order`.tblPrice
SET ID = intid2 WHERE ID = 1;
END
'''
Unfortunately, we cannot make the handshake asynchronous. However, a potential solution would be to delay the verification to a later stage (like in a Quarkus REST interceptor or an HTTP route). In these later stages, you can execute your blocking I/O on a worker thread.
It was very useful for me, like this.
For AL2 (Amazon Linux 2) users:
yum install gcc-c++ -y
Just select a specific .blueprint file and push the delete button on the keyboard. On your screenshot, you are selecting a whole changelist.
If you have an Apple device (MacBook, Mac Mini, etc.), you can make use of the --local flag to build APK files locally on your device.
Example
eas build --profile [preview | development | production] --platform [ios | android] --local
You should use AR Foundation instead of Vuforia.
https://docs.unity3d.com/Packages/[email protected]/manual/index.html