I think QARetrievalChain and GraphCypherChain both output runnable classes that aren't directly compatible with standard LLM Chain nodes in Flowise.
Possible Solution
Try using a Custom Function node to merge the outputs from both RAG flows:
1. Create a Custom Function node that accepts both runnable outputs.
2. Extract the actual response data from each runnable class using their respective methods (like .invoke() or .run()).
3. Combine the responses in your custom logic.
4. Pass the merged result to an LLM Chain node for final response generation.
You should add the user and database options to the pg_isready command:
pg_isready -U myUser -d myDb
See the following post.
The solutions suggested there didn't work for me, but this does:
$('#grid').data("kendoGrid").cancelChanges();
Maybe update libheif to a newer version (I used Homebrew to do so on macOS).
I ran into this issue while installing another Python library with pi-heif as a dependency on macOS, with the same error: error: call to undeclared function 'heif_image_handle_get_preferred_decoding_colorspace'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration].
There was a 1.16.2 version of libheif installed on my computer, but the latest version available on Homebrew as of now (2025-08-05) is 1.20.1.
I just reinstalled the latest version of libheif, and it all works now.
In these compilers, the -O compile option (or potentially -O2, -Os, -Oz, or others, depending on the use case) can be used to collapse identical switch statements.
I had the same issue after updating express
from version 4 to 5.1.0 while uploading files. I had to update body-parser
to the latest version (2.2.0).
As per the official Bitbucket documentation, there is a storage limit as well as an expiry date (https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/), so if you need more than that, use your own storage such as Amazon S3.
A combination of flutter_background_service and flutter_local_notifications will work.
Using this enables you to set custom time intervals.
I was also facing this issue for the past 2 months; eventually we changed to HTTPS.
Today when I checked, the issue was still there, but I tried many steps and fixed it.
Steps:
Remove your existing HTTP request checking code in Info.plist.
Add this:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>localhost</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
Then, in Xcode: Product -> Clean Build Folder.
Then run the app on your iOS device.
Now HTTP URLs work on iOS as needed.
Have you tried this?
[views]="['day', 'week', 'workWeek', 'month']"
//You can set it as the default.
currentView="workWeek"
I've chanced upon a similar issue and I'd like to extend the problem described above, hoping to find some enlightenment.
On the same website linked above, there is a checkbox option (a form input) for "Also find historicised data" to obtain the full history of the dataset. Inspecting the HTML element and checking the code above, this leads to a POST
to https://www.bundesanzeiger.de/pub/en/nlp?0-1.-nlp~filter~form~panel-form with a payload of form inputs.
payload = {
    "fulltext": None,
    "positionsinhaber": None,
    "ermittent": None,
    "isin": None,
    "positionVon": None,
    "positionBis": None,
    "datumVon": None,
    "datumBis": None,
    "isHistorical": "true",
    "nlp-search-button": "Search net short positions"
}
Below, I'm using a modified version of Andre's code to POST with isHistorical=true, followed by a GET of the original download link, but it seems to only return the default result (i.e., the non-historicised dataset). I'm not sure what I might be missing here and would appreciate someone taking a look. Thanks!
import requests

def net_short_positions():
    url = "https://www.bundesanzeiger.de/pub/en/nlp?0--top~csv~form~panel-form-csv~resource~link"
    headers = {
        "User-Agent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/118.0",
        "Referer": "https://www.bundesanzeiger.de/",
    }
    payload = {
        "fulltext": None,
        "positionsinhaber": None,
        "ermittent": None,
        "isin": None,
        "positionVon": None,
        "positionBis": None,
        "datumVon": None,
        "datumBis": None,
        "isHistorical": "true",
        "nlp-search-button": "Search net short positions"
    }
    with requests.session() as s:
        s.headers.update(headers)
        s.get("https://www.bundesanzeiger.de/pub/en/nlp?0")
        s.post("https://www.bundesanzeiger.de/pub/en/nlp?0-1.-nlp~filter~form~panel-form", data=payload, headers=headers, allow_redirects=False)
        return s.get(url).content
Add these two lines after plt.close():
from IPython.display import HTML
HTML(ani.to_html5_video())
After the question was asked, VS Code received many commits and versions, so some answers became unusable.
The task:
My default browser is Edge, and I want to set Chrome as the default browser for VS Code.
The current VS Code build is dated 2025-07-29.
Go to File > Preferences > Settings.
Then, for the User, go to Workbench > scroll down and find External Browser.
If you want to set a different browser for a workspace, select the Workspace tab and the Workbench group > scroll down and find External Browser.
You can restart VS Code to check.
Even though they’re in the same region, by default their communication will still be done through the public internet which may result in higher latency and data transfer fees.
To avoid these inconveniences, you need to enable traffic to use private IPs over the AWS internal network by setting up VPC peering.
You can learn how to enable VPC Peering in the official documentation: https://redis.io/docs/latest/operate/rc/security/vpc-peering/
Sorry, there is no backport. This was a deep change to the language grammar.
TypeError: crypto.hash is not a function
I was on Node.js v20 and the error persisted. After upgrading to v22, the error was resolved and npm run dev worked as expected.
I'm the developer of the Transcribe Audio to Text Chrome extension, which performs audio-to-text transcription using Whisper AI. I'm currently working on an update for which I experimented heavily with streaming transcription and different architectural setups.
In my experience, achieving true real-time transcription using the Whisper API is not really feasible at the moment, especially when you're aiming for coherent, context-aware output. Whisper processes chunks holistically, and when forced into a pseudo-streaming mode (e.g., with very short segments), it loses context and the resulting transcription tends to be fragmented or semantically broken.
After multiple experiments, I ended up implementing a slight delay between recording and transcription. Instead of true live streaming, I batch short audio chunks, then process them with Whisper. This delay is small enough to feel responsive, but large enough to preserve context and greatly improve output quality.
For mobile or React Native scenarios, you might consider this hybrid model: record short buffered segments, then send them asynchronously for transcription. It won't be word-by-word real-time, but it offers a much better balance between speed and linguistic quality.
I was running into this issue myself! What I ended up doing was to add the shared folder to the tailwind.config.js
in the content setting. For instance, my config has this line
content: ['./src/**/*.{js,jsx,ts,tsx}', './public/index.html', '../shared/**/*.{js,jsx,ts,tsx}']
where all of my common components are in the shared
folder/workspace.
Did you find a way to solve it? I'm experiencing the same issue with SceneView on Android, which uses Filament 1.56 internally. I opened an issue in the SceneView repo, but maybe it's related to Filament.
https://github.com/SceneView/sceneview-android/issues/624#issue-3267444062
Thank you @ChristianStieber, I had completely missed that I was using a constructor with parameters as the default constructor. :(
class A {
public:
A() : obj(nullptr) {}
A(int*& obj_) : obj(obj_) {}
protected:
int* obj;
};
class B : public A {
public:
B() : A() {}
B(int*& obj_) : A(obj_) {}
};
<!-- prettier-ignore -->
<script type="module" src="https://unpkg.com/[email protected]/dist/ionicons/ionicons.esm.js"></script>
<!-- prettier-ignore -->
<script nomodule src="https://unpkg.com/[email protected]/dist/ionicons/ionicons.js"></script>
An SDK update during the npm start step installed the recommended SDK version, which fixed the process getting stuck on the Expo screen.
This method fixed my issue:
Deleted the node_modules folder and re-ran npm install.
Ran npm start.
While running the npm start command, I was prompted with:
Expo Go 2.33.20 is recommended for SDK 53.0.0 (MePh is using null). Learn more: https://docs.expo.dev/get-started/expo-go/#sdk-versions. Install the recommended Expo Go version? ... yes
Uninstalling Expo Go from Android device MePh.
Downloading the Expo Go app [================================================================] 100% 0.0s
Copying the DLL to the *.exe folder helped fix the issue.
Reference: https://github.com/pyinstaller/pyinstaller/issues/4935
Figured it out: it turns out Adblock Plus was blocking that particular div for some reason (maybe because it had a LinkedIn link?). Once I turned it off, it shows as normal. Thank you all for your help narrowing it down.
The code below is not running:
conn = pymysql.connect(host="XX.mysql.pythonanywhere-services.com", user="XX", password="Any", database="XX$MyDB")
Pinecone filters need to use comparison operators. For exact matches, use the $eq
operator:
const filter = {
    info: { $eq: state.info }
};
Me too. What should I do? I have installed pinentry-mac.
Have you tried a FormatCurrency sample like this:
lstOutput.Items.Add("Net pay: " & FormatCurrency(txtNetPay, 2, True))
toml is a Gradle version catalog; see the Gradle documentation.
In settings.gradle there is:
repositories {
    google()
}
It reads dependencies from the Google Maven repository.
The 8.0.19
runtime is already on NuGet, but it might take a little longer until it's fully picked up by GitHub Actions or your environment.
Just give it a bit more time – it should resolve itself shortly.
https://www.nuget.org/packages/Microsoft.NETCore.App.Runtime.win-x64/8.0.19
For an example with source code, see https://github.com/usermicrodevices/prod-flet.
The application supports all major operating systems: Linux, Windows, macOS, iOS, Android. The graphical user interface is built on the principle of filling the customer's basket, but in reality it is cash register software that performs all the mathematical calculations of a POS terminal. Product search works by typing on the keyboard or by scanning with a connected scanner. The mobile version also contains a built-in barcode and QR code scanner, so a phone or tablet can be used as a data collection terminal.
After synchronizing the directories with the server, the application can work autonomously. Synchronization of directories runs at configurable time intervals. Sending sales and order data to the server can also be delayed in case of network breaks and is likewise configured at selected time intervals.
Trading equipment can be integrated through device drivers; an example of connecting trade scales can be found in the open-source project. Banking and fiscal equipment can also be integrated through the appropriate drivers from manufacturers. Entering balances to control stock (inventory management) in a warehouse or retail outlet is done from the application through an order, which is changed in the server admin panel to the appropriate document type with the appropriate counterparty.
The advantage of OneHotEncoder is that it remembers which categories it was trained on. This is very important because once your model is in production, it should be fed exactly the same features as during training: no more, no less.
I use Python 3.10. HTTPError didn't work for me; use URLError from urllib.error instead:
from urllib.error import URLError
from urllib.request import urlopen

try:
    urlopen('http://url404.com')
except URLError as e:
    print(e.reason)  # URLError exposes .reason; only HTTPError has .code
If I specify the target DNS server, it works (thanks to another AI bot):
http://10.0.1.192:9115/probe?module=dns&target=8.8.8.8&debug=true
Check your CSS for correct font-weight values and ensure the font variant is loaded.
Improve the regex pattern: gsub("\\s*\\([^)]*\\)", "", ...) captures any text in parentheses regardless of content.
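For illustration, here is the same pattern in Python's re module (a sketch equivalent to the R gsub call above; strip_parens is a made-up helper name):

```python
import re

def strip_parens(s):
    # Remove any parenthesized text, along with the whitespace before it,
    # regardless of what the parentheses contain.
    return re.sub(r"\s*\([^)]*\)", "", s)

print(strip_parens("Berlin (Germany)"))   # Berlin
print(strip_parens("foo (a) bar (b c)"))  # foo bar
```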
Add the option .withoutEscapingSlashes to the JSONSerialization writing options.
Here is the solution that worked for me; I hope it helps. I'll also leave a link so you can visit my site, greetings to everyone (https://programacion365.com/):
The code of my main class is the following:
package com.example.spring_email;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
@SpringBootApplication
public class SpringEmailApplication {
public static void main(String[] args) {
SpringApplication.run(SpringEmailApplication.class, args);
}
@Bean
CommandLineRunner runner(EmailService emailService) {
return args -> {
String destinatarios = "[email protected],[email protected]";
String asunto = "Correo de prueba";
String cuerpo = "<h1>Hola a todos</h1><p>Este es un mensaje para múltiples destinatarios.</p>";
emailService.enviarCorreo(destinatarios, asunto, cuerpo);
};
}
}
We create a controller with the following code:
package com.example.spring_email.controladores;
import com.example.spring_email.EmailService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.*;
@Controller
public class EmailController {
@Autowired
private EmailService emailService;
@GetMapping("/formulario")
public String mostrarFormulario() {
return "email_form";
}
@PostMapping("/enviar")
public String enviarCorreo(
@RequestParam("destinatarios") String destinatarios,
@RequestParam("asunto") String asunto,
@RequestParam("cuerpo") String cuerpo,
Model model) {
try {
emailService.enviarCorreo(destinatarios, asunto, cuerpo);
model.addAttribute("mensaje", "Correo enviado exitosamente.");
} catch (Exception e) {
model.addAttribute("mensaje", "Error al enviar el correo: " + e.getMessage());
}
return "email_form";
}
}
We also create a service:
package com.example.spring_email;
import jakarta.mail.internet.MimeMessage;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ClassPathResource;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.mail.javamail.MimeMessageHelper;
import org.springframework.stereotype.Service;
@Service
public class EmailService {
@Autowired
private JavaMailSender mailSender;
public void enviarCorreo(String destinatarios, String asunto, String mensajeHtml) throws Exception {
MimeMessage mensaje = mailSender.createMimeMessage();
MimeMessageHelper helper = new MimeMessageHelper(mensaje, true);
String[] destinatariosArray = destinatarios.split("[;,]");
helper.setTo(destinatariosArray);
helper.setSubject(asunto);
// Add HTML content with a signature
String htmlConFirma = mensajeHtml +
"<br><br><img src='cid:firmaImagen' alt='Firma' width='200'/>";
helper.setText(htmlConFirma, true);
// Load the image from resources
ClassPathResource imagen = new ClassPathResource("static/images/logo.png");
helper.addInline("firmaImagen", imagen);
mailSender.send(mensaje);
}
}
And finally, we create the HTML submission form:
<!DOCTYPE html>
<html xmlns:th="http://www.thymeleaf.org">
<head>
<title>Enviar Correo</title>
<meta charset="UTF-8">
</head>
<body>
<h1>Enviar correo a múltiples destinatarios</h1>
<form th:action="@{/enviar}" method="post">
<label>Destinatarios (separados por coma):</label><br>
<input type="text" name="destinatarios" style="width: 400px;" required><br><br>
<label>Asunto:</label><br>
<input type="text" name="asunto" style="width: 400px;" required><br><br>
<label>Cuerpo del mensaje:</label><br>
<textarea name="cuerpo" rows="10" cols="50" required></textarea><br><br>
<button type="submit">Enviar</button>
</form>
<p th:text="${mensaje}" style="color: green;"></p>
</body>
</html>
@AdamNagy When I enable the largeModelExperienceBeta: true setting or disable the Large Model Environment, I encounter the following error in the console:
Consolidation failed. TypeError: event.model.getData is not a function
at MultipleModelUtil.js:143:85
at Array.findIndex (<anonymous>)
at cP.progressUpdated (MultipleModelUtil.js:143:43)
at cP.dispatchEvent (EventDispatcher.js:154:41)
at DT.signalProgress (Viewer3DImpl.js:3033:18)
at gS.removeJob (RenderModel.js:234:40)
at RenderModel.js:1331:25
at <anonymous>
at async lE.onGeomLoadDone (OtgLoader.js:1400:55)
Due to this error, I suspect that the 'Large Model Experience' setting may not be properly applied.
I searched and finally found a similar post:
Debug a Python C/C++ Pybind11 extension in VSCode [Linux]
"Upon reaching your binded code in Python, you may have to click manually in the call stack (in the debug panel on the left) to actually switch into the C++ code."
From the above description, it seems that we need to click the call stack to switch to C++ code manually. So what I encountered is normal.
It seems you are passing a tuple as a parameter while the function expects keyword arguments.
You can try something like the following, unpacking the values:
params = {
    '0': 'bitcoin',
    '1': 115251,
    '2': '2025-08-04T18:30:40.926246+00:00',
    '3': 'Monday'
}
conn.run(INSERT_SQL, **params)
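To illustrate the difference without a database, here is a plain-Python sketch of passing a tuple as one argument versus unpacking it (insert and record are made-up names for demonstration):

```python
def insert(coin, price, ts, day):
    # Stand-in for a parameterized insert: four separate arguments expected.
    return f"{coin} @ {price} on {day}"

record = ('bitcoin', 115251, '2025-08-04T18:30:40.926246+00:00', 'Monday')

# Passing the tuple itself gives the function ONE argument (the whole tuple):
#   insert(record)  ->  TypeError: missing 3 required positional arguments
# Unpacking with * spreads it into four separate arguments:
print(insert(*record))  # bitcoin @ 115251 on Monday
```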
You can check this library: https://pypi.org/project/sklearn-migrator/
It is very useful for migrating scikit-learn models between versions.
Changing the package to main.java
did the trick.
In the java class that is:
package main.java;
Whether the import statement is updated or not doesn't matter, both work:
(:import [main ModelLoader])
and
(:import [main.java ModelLoader])
Why? I wouldn't mind a more in-depth explanation myself.
Now, with a working package, the class visibility pointed out by @Eugene actually makes a difference: the class needs to be public; otherwise the same error persists.
I would tweak all the second-jump values (e.g., increase the Y movement, change the move time) to try to make it look better. I would also consider adding a small fall before the second jump.
Sometimes a gift can be a refuge in difficult moments;
but more important than that is the thought behind the gift.
This cute little thing is a reminder of the kindness of someone who knows well what "calm" means...
and more importantly, who knows that for "Nooshin", calm lives in these simple, unpretentious moments; nothing flashy, nothing busy...
I'm so happy this feeling is still alive in me; that with such meaningful simplicity I can feel endless joy and delight. 🤍
If you want to display just HTML, CSS, and images, use any templating language like ejs or pug.
If you just want the user to download them on visit, add them to an archive: you can use something like JSZip or any other library to build the archive, then add the required headers (Content-Disposition and others) to make the user download it.
I have a similar issue: entities not known. The trick is that HA does not recognize those 'switch' entities. It works perfectly via Node-RED, though.
I was wondering if you have the same ADVANCED settings inside VirCOM.
One day I just don't want to have any problems at all. I spent 2 hours on this and still no solution. What's wrong with Windows 11 and all of this Java nonsense? I just want to turn my PC off, get on my bike, go somewhere up a mountain, and live there in peace! I'm fed up with all this tech jargon, I really am.
One thing that's often overlooked in these discussions is that the VTT isn't just about constructing subobjects correctly; it's about simulating the illusion of identity for each base during phased construction. The virtual VTT entries exist primarily to support downcasting and method dispatch in contexts where the derived object is not yet fully formed but the language semantics require 'as-if' behavior. This is particularly relevant when you have virtual inheritance across multiple independent branches of the hierarchy that must reconcile shared virtual bases. You're essentially encoding partial object perspectives with temporary vptr adjustments, allowing each base to 'believe' it's the most derived type during its own construction.
from docx import Document
# Create the Word document
doc = Document()
# Add content
doc.add_heading('GILVAN DOS SANTOS DO NASCIMENTO', level=1)
doc.add_paragraph('Nova Iguaçu – RJ')
doc.add_paragraph('Telefone: (21) 97099-8932')
doc.add_paragraph('E-mail: [email protected]')
doc.add_paragraph('Idade: 17 anos')
doc.add_paragraph('')
doc.add_heading('Objetivo', level=2)
doc.add_paragraph(
'Ingressar no programa de Jovem Aprendiz, com o objetivo de adquirir experiência profissional, '
'desenvolver habilidades e contribuir positivamente com a equipe da empresa.'
)
doc.add_paragraph('')
doc.add_heading('Formação Acadêmica', level=2)
doc.add_paragraph('Ensino Médio – 2ª série (em andamento)')
doc.add_paragraph('Previsão de conclusão: 2026')
doc.add_paragraph('')
doc.add_heading('Cursos Complementares', level=2)
doc.add_paragraph('Informática Básica')
doc.add_paragraph('')
doc.add_heading('Experiência Profissional', level=2)
doc.add_paragraph(
'Ainda não possuo experiência profissional formal, mas estou em busca da minha primeira oportunidade '
'no mercado de trabalho para aprender e crescer profissionalmente.'
)
doc.add_paragraph('')
doc.add_heading('Habilidades e Competências', level=2)
doc.add_paragraph('- Facilidade de aprendizado')
doc.add_paragraph('- Boa comunicação')
doc.add_paragraph('- Responsabilidade e pontualidade')
doc.add_paragraph('- Trabalho em equipe')
doc.add_paragraph('- Conhecimento básico em informática (Word, Excel, Internet)')
doc.add_paragraph('')
doc.add_heading('Informações Adicionais', level=2)
doc.add_paragraph('- Disponibilidade: Tarde')
doc.add_paragraph(
'- Interesse em atuar nas áreas de: administração, atendimento ao cliente, estoque ou auxiliar de escritório'
)
# Save the document
word_file_path = "/mnt/data/Curriculo_Gilvan_Dos_Santos.docx"
doc.save(word_file_path)
word_file_path
This is basically a Java error. You must install the JDK by first activating the environment:
conda activate <env_name>
conda install openjdk
pip install findspark
Rebuild the Spark session, and I think this should solve the problem.
For Ignition:
Use
IGNITION_EDITOR=vscode-remote
IGNITION_REMOTE_SITES_PATH=/home/paul
IGNITION_LOCAL_SITES_PATH=wsl+Ubuntu/home/paul
For Ray:
Use Custom URL Editor Preference with vscode://vscode-remote/wsl+Ubuntu/%path:%line
Do not configure RAY_LOCAL_PATH
in .env
This problem still occurs - and is apparently due to CPAN not bothering to check whether a directory exists or not before attempting to write to it.
Ran into this running cpan for the first time on a new host as 'root' - needless to say being told I didn't have write permissions was a bit of a head scratcher.
Turns out, if any directory listed in @INC does not exist, this error is the result - it is interpreting any failure as permission denied.
Creating the missing directory made the error go away.
... I updated this because this is the first result back from a Google search for the error message, so it might as well pay off for the next poor schlub who runs into this. Seven years, and CPAN hasn't bothered to find or fix it.
Instead of finding the admin password, why don't you just change the password with one line CLI?
docker exec -it $(docker ps --filter name=uwsgi --format {{.ID}}) ./manage.py shell -c 'from django.contrib.auth.models import User; u = User.objects.get(username="admin"); u.set_password("Dojo1234!"); u.save()'
I'm not sure that XPath is available in the text editor. I believe it must be a valid JS selector.
This is working for me:
a[Text=">"]
Yes, and it's available here: https://laravel.com/docs/12.x/queries#additional-where-clauses
But for your case you can simply do:
->when($filter, function ($query, $filter) {
    collect($filter)->each(function ($value) use ($query) {
        $query->whereIn('products.value', explode(',', $value));
    });
})
This is cleaner.
Thanks to @Nassau for pointing out the correct CHS values using sfdisk -g
. I had been using BPB values for calculating CHS, which led me to load the wrong sector.
Correcting the seek=
in dd
to match C:0 H:4 S:38
(i.e., sector 307) fixed the problem.
Change the types of som and rotina from byte to byte[] in your Mesa class:
private byte[] som;
private byte[] rotina;
private byte[] address64Bits;
And use pst.setBytes(...) instead of pst.setByte(...):
pst.setBytes(6, mesa.getSom());
pst.setBytes(7, mesa.getRotina());
pst.setBytes(8, mesa.getAddress64Bits());
I think the issue could be Pylance; I have heard of this problem being caused by it. Try disabling Pylance.
Canceling this question. It turns out this was a strange side effect of a little hack I did years ago to get EF Core to play nicely with Snowflake. To get it to work, I needed to re-set the connection string, which caused a bad state where the Snowflake code maxed out the original connection string's pool, but when it went looking for idle connections it couldn't find any because the string had changed.
Change the variables som and rotina to byte[], and then use pst.setBytes() instead of pst.setByte().
isosynetic might serve your purpose. It works for me in language acquisition studies, " Pointing to something and referring to it are isosynetic communication strategies."
Is my angle computation logic correct? Should I normalize angles differently?
As @ravenspoint and @btilly pointed out, calculating precise angles with atan can be prone to floating-point errors. In this case, we can compare coordinates directly to check for horizontal, vertical, or 45° diagonal lines.
Given a thrower (starting point) at (x1, y1) and a catcher (target point) at (x2, y2):
↑ North : x2 == x1
and y2 > y1
→ East: y2 == y1
and x2 > x1
↓ South: x2 == x1
and y2 < y1
← West: y2 == y1
and x2 < x1
When (x2 - x1) is equal to (y2 - y1), the "run" (horizontal distance) equals the "rise" (vertical distance).
↗ North East: (x2 - x1) = (y2 - y1)
and x2 > x1
↙ South West: (x2 - x1) = (y2 - y1)
and x2 < x1
↖ North West: (x2 - x1) = -(y2 - y1)
and x2 < x1
↘ South East: (x2 - x1) = -(y2 - y1)
and x2 > x1
A line with a slope of 1 is a 45° diagonal.
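The comparisons above can be sketched as a small Python helper (illustrative only; direction is a made-up name, not from the original code):

```python
def direction(x1, y1, x2, y2):
    """Classify the catcher's position relative to the thrower using the
    coordinate comparisons described above (no trigonometry needed)."""
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return None              # same point
    if dx == 0:
        return "N" if dy > 0 else "S"
    if dy == 0:
        return "E" if dx > 0 else "W"
    if dx == dy:                 # slope +1 diagonal
        return "NE" if dx > 0 else "SW"
    if dx == -dy:                # slope -1 diagonal
        return "SE" if dx > 0 else "NW"
    return None                  # not on one of the eight 45-degree rays

print(direction(0, 0, 0, 5))   # N
print(direction(0, 0, 3, 3))   # NE
print(direction(0, 0, 2, -2))  # SE
```

Because only integer comparisons are involved, this avoids the floating-point pitfalls of atan entirely.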
Is my method for selecting the closest player correct? How can I verify it?
Calculating the Euclidean distance should be enough.
Is there a more robust way to handle player rotation?
Clockwise rotation looks good.
----
Your code has a few logical issues:
The logic appears to check only the thrower's initial direction and stops the turn if no catcher is in that direction.
The catcher's new throwing direction seems to be a rotation of the previous thrower's direction. The catcher's new orientation should be the opposite of the direction they received the ball from
The simulation does not seem to remove a player from the field after they have thrown the ball
Here are the animations that demonstrate this logic:
int[] marks = {78, 85, 90, 60, 99};
int maxNum = 0;
for (int i = 0; i < marks.length; i++) {
    if (maxNum < marks[i]) {
        maxNum = marks[i];
    }
}
System.out.println(maxNum);
Nice concept! Just add sys.exit() to quit cleanly. Also, that NameError can be fixed by passing search_area to badResponse().
For requests that may take a long time, consider using a callback mechanism.
Generate a unique identifier for the request (e.g., UUIDv4) and store it (e.g., in a database) along with a mapping to the file you're downloading from the other endpoint.
Return a callback URL to the client that includes this unique ID.
The client can then poll the callback URL to check if the file is ready.
If the client checks too early, return a response indicating that the file is not ready yet.
If the client never accesses the callback URL, make sure to implement a cleanup mechanism to remove unused data.
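The steps above can be sketched in a minimal in-memory form (start_download, finish_download, and poll are illustrative names; a real service would use a database and HTTP endpoints):

```python
import uuid

# In-memory stand-in for the persistent store mentioned above.
jobs = {}

def start_download(source_url):
    """Register a long-running request and hand back a callback ID."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "pending", "file": None, "source": source_url}
    return job_id  # the client polls the callback URL with this ID

def finish_download(job_id, file_path):
    """Called by the worker once the file has actually been fetched."""
    jobs[job_id].update(status="ready", file=file_path)

def poll(job_id):
    """What the callback URL handler would return to the polling client."""
    job = jobs.get(job_id)
    if job is None:
        return {"status": "unknown"}   # expired or cleaned up
    if job["status"] != "ready":
        return {"status": "pending"}   # client should retry later
    return {"status": "ready", "file": job["file"]}

job = start_download("https://example.com/big-file")
print(poll(job)["status"])     # pending
finish_download(job, "/tmp/big-file")
print(poll(job)["status"])     # ready
```

The cleanup mechanism from the last step would simply delete stale entries from the store on a schedule, after which poll returns "unknown".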
Open ui-grid.js or ui-grid.min.js:
gridMenu: {
    aria: { buttonLabel: 'Grid Menu' },
    columns: 'Columns:',
    importerTitle: 'Import file',
    exporterAllAsPdf: 'Export todo como pdf',
    exporterVisibleAsPdf: 'Export visible data as pdf',
    exporterSelectedAsPdf: 'Export selected data as pdf',
    exporterAllAsExcel: 'Exportar todo a excel',
    exporterVisibleAsExcel: 'Exportar solo lo visible a Excel',
    exporterSelectedAsExcel: 'Export selected data as excel',
    clearAllFilters: 'Clear all filters'
},
This is confusing. Is the issue related to Cursor or to VS Code? We don't need to know that you're aware Cursor is a fork of VS Code. Check your plugins; this may be a side effect.
You need to wrap the value in an object with the $eq (equals) operator.
const filter = {
info: { "$eq": state.info },
};
In case you get to this page: try using timeouts on your workflows as a safety gate.
Late reply. I appreciate the responses; this was a typical RTFM failure on my part. Using the module-qualified name was super helpful and helps a lot going forward with other modules. Credit to both responders; I'm not sure how to award points, but if they're available, they're given.
Enable AndroidX in gradle.properties
android.useAndroidX=true
android.enableJetifier=true
Does updateAge extend the expiry time of maxAge when using JWT as the session strategy?
from moviepy.editor import ImageClip, concatenate_videoclips

# Assumes `image_path` and `video` were defined earlier in the script.
# Create a 3-second clip with the image as the background
intro_clip = ImageClip(image_path).set_duration(3).resize(video.size)

# Concatenate the intro with the original video
final_video = concatenate_videoclips([intro_clip, video])

# Export the new video with the intro
final_video.write_videofile("/mnt/data/video_com_intro_vila_ede.mp4", codec="libx264", audio_codec="aac")
Chainsaw Man is raw, violent, and emotionally gripping. It tells the story of Denji, a devil hunter with the power of a chainsaw, fighting through a brutal world of devils, betrayal, and surreal chaos.
I ran into this exact issue where .css
files were being served with the text/plain
content type, and the browser was refusing to apply the styles.
I went through a long list of troubleshooting steps:
Made sure the mime.types
file was included in my Nginx config
Verified that .css
was correctly mapped to text/css
Tried different location
blocks
Double-checked the file paths
Even reinstalled Nginx at some point
Still, no luck — Chrome kept showing Content-Type: text/plain
in the Network tab, and the styles just wouldn't apply.
After some frustration, I noticed that the network request in Chrome had been cached, and the cached response had the wrong content type. Here's what worked:
I disabled caching in Chrome DevTools (Network tab → "Disable cache") and refreshed the page, and suddenly the CSS was loading correctly with the right text/css content type.
So in my case, it was a caching issue, and all the correct configurations were being ignored because the browser was holding onto an old, incorrect response.
The first one can be fixed by adding this to your settings:
"chat.tools.autoApprove": true
Still trying to figure out how to disable the second option.
Instructions for updating Jenkins to Java 21: https://www.jenkins.io/doc/book/platform-information/upgrade-java-to-21/
Did you miss this step?
Upgrade to Java 21 on your Jenkins controller
Stop the Jenkins controller with systemctl stop jenkins.
Install the corresponding Java version with dnf -y install temurin-21-jdk or with the package manager your system uses.
Check the Java version with java -version.
Change the default Java for the system by running update-alternatives --config java and then enter the number that corresponds to Java 21, for example 2 if that is the correct option.
Restart Jenkins with systemctl restart jenkins.
From what I know, it is possible to use the "dynamic import" feature in all JS files, regardless of whether they are modules, workers, or anything else. The syntax for dynamic import can be found here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import
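As a minimal sketch (assuming a Node.js environment; in a browser you would pass a URL or module path instead), a dynamic import can be called from any script, even a non-module one:

```javascript
// Dynamic import returns a promise for the module namespace object,
// so it works inside classic scripts, workers, and modules alike.
// 'node:path' is just a convenient built-in module to demonstrate with.
async function loadPath() {
  const path = await import('node:path');
  return path.join('folder', 'file.txt');
}

loadPath().then((joined) => {
  console.log(joined); // e.g. "folder/file.txt" on POSIX systems
});
```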
This is a known issue. Ad blockers and privacy-focused browsers block scripts like Google Analytics and Google Tag Manager by using filter lists based on domains, JavaScript patterns, and URL paths (e.g. googletagmanager.com, /collect, etc.).
One solution recommended across the internet is server-side GTM (ssGTM), which claims to reduce ad blockers' impact on analytics and marketing tools. But nowadays even first-party ssGTM setups get blocked by URL patterns, and if not ssGTM itself (for instance, when it uses a custom load script), then the other products used by GTM are blocked. I even wrote an article explaining why server-side GTM isn't a complete solution against ad blockers.
A more robust approach would be the network-level protection.
To prevent GA or GTM from being blocked you can route requests through a protected proxy channel that masks both the destination and the structure of the calls. This approach remains fully compliant – as long as you aggregate data properly and ensure user consent.
DataUnlocker solves this by handling technical blocking at the network level. It reroutes tracking requests through a customizable endpoint that avoids detection by ad blockers, and it ensures this channel can't be affected or compromised long-term.
Disclaimer: I’m the founder of DataUnlocker. I’m adding this here because this thread still shows up in search results and I hope this explanation helps others facing similar issues.
DataUnlocker can be integrated into any web app and complements all the marketing products you already use there. It takes a few steps to install, after which your tools are protected.
I'm the developer of this Chrome extension that transcribes audio using Whisper AI. I'm currently working on an update focused specifically on real-time transcription.
After testing various approaches, I found that true streaming with Whisper isn't yet possible in the strict sense (Whisper requires full chunks of audio). However, the most reliable solution I've implemented is processing 15-second audio blocks in near-real-time. This allows the app to simulate streaming with acceptable latency and stable transcription quality.
I ran several experiments and found that:
- Shorter blocks (e.g., 5–10 sec) often lead to poor language model context and lower accuracy.
- Longer blocks increase latency and risk losing responsiveness.
- 15 seconds strikes the best balance between speed and transcription quality.
So if you're looking to simulate real-time transcription using Whisper's API, slicing the input into 15-second segments and processing each one as it completes is currently the most practical method.
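The slicing step can be sketched as follows; this is a minimal illustration, where transcribe_block is a hypothetical stand-in for an actual Whisper API call:

```python
# Sketch: simulate near-real-time transcription by slicing audio into
# fixed 15-second blocks. `transcribe_block` is a hypothetical placeholder
# for a real Whisper API request.

BLOCK_SECONDS = 15
SAMPLE_RATE = 16_000  # Whisper expects 16 kHz audio

def split_into_blocks(samples, sample_rate=SAMPLE_RATE, block_seconds=BLOCK_SECONDS):
    """Yield consecutive fixed-length blocks of raw audio samples."""
    block_len = sample_rate * block_seconds
    for start in range(0, len(samples), block_len):
        yield samples[start:start + block_len]

def transcribe_block(block):
    # Placeholder: a real implementation would send `block` to Whisper
    # and return the recognized text.
    return f"<{len(block)} samples transcribed>"

# 40 seconds of silence -> three blocks (15 s, 15 s, and a final 10 s)
audio = [0.0] * (SAMPLE_RATE * 40)
blocks = list(split_into_blocks(audio))
transcripts = [transcribe_block(b) for b in blocks]
```

Each block is transcribed as soon as it completes, which is what produces the near-real-time feel despite Whisper's chunked input requirement.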
It's a rendering bug in CLion, and this type of comment is not supported in other IDEs, as @Friedrich pointed out. So you need to decide whether it's worth keeping, but it's easy to remove anyway: just find-and-replace // TIP with //.
There is a lot you can do with comments like TODO, which are fairly well supported in most IDEs.
You may also be interested in generating documentation from comments with tools like Doxygen.
When you run:
$token = (Get-AzAccessToken -ResourceUrl $baseUrl).Token
it uses your currently logged-in Azure session (via Connect-AzAccount) to generate a token. That token works fine in Postman or manual PowerShell.
In the CI/CD context there's likely no interactive login session, so Get-AzAccessToken might not have a valid token context, or it generates a token that's not valid for the resource you're querying, or the service principal or managed identity used in the pipeline lacks the permissions required to call the EvolvedSecurityTokenService, which handles the ARM tokens.
Please try using Connect-AzAccount explicitly with a service principal if you are running in a pipeline:
$securePassword = ConvertTo-SecureString $env:AZURE_CLIENT_SECRET -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($env:AZURE_CLIENT_ID, $securePassword)
Connect-AzAccount -ServicePrincipal -Credential $credential -Tenant $env:AZURE_TENANT_ID
Then retry:
$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token
And please make sure that the resource group where $queryPackName exists grants Contributor access.
If your pipeline uses az login instead, please try:
$token = az account get-access-token --resource=https://management.azure.com --query accessToken -o tsv
You can try this Gaussian elimination online calculator
The reason is here:
So it first gets converted into a list and then it gets converted to an array.
With current WASM implementations, you could write your WASM modules such that their execution state is fully specified by exported globals, tables, and the contents of linear memory, and save all of that. But there's no external means to force access to all the needed information if you don't make the module export everything relevant, and refrain from using any data that is not externally visible.
However, you could write your own WASM runtime, and ensure that the data structures it uses to represent the instantiated module state are all reversibly serializable. You could even do so in JavaScript and WASM, if you want it to run in a browser, but you'd obviously have a significant emulation slowdown.
Just make sure you successfully completed all steps from React Native iOS Messaging Setup documentation.
The Maven Shade plugin has a minimizeJar option.
This will remove all classes from the shaded jar that are not referenced by your code, reducing its footprint.
See https://headcrashing.wordpress.com/2020/02/23/reducing-jar-footprint-with-maven-shade-plugin-3-2-2/
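For reference, a minimal sketch of the plugin configuration in pom.xml (the version number is illustrative; check for the current release):

```xml
<!-- Sketch: maven-shade-plugin with minimizeJar enabled -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.6.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!-- Strip unreferenced classes from the shaded jar -->
        <minimizeJar>true</minimizeJar>
      </configuration>
    </execution>
  </executions>
</plugin>
```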
So the problem was the trailing *. I am not sure why, but having * within the URL (www..*.amazonaws.com) is fine, while (www.example.com/\**) is treated literally, so it matches the exact URL with * at the end.
I'm currently doing some modifications on a website where I created a banner that has a button to open a "mailto:" link, and I managed to customize it so when it opens the mail client window, it already has the desired info on recipient, subject and body fields.
It's a WordPress site, built with Elementor, and I used the Royal Elementor Addons plugin to create a popup banner. But when I added the "mailto:" link, none of the HTML tags I tried worked (I wanted to add paragraphs, bold, and underlines). After researching a bit further I found out that there are specific escape sequences for this type of link, and I managed to add the paragraphs to the mail body like this (without having to use "<a href>" and similar tags):
mailto:[email protected]?subject="Novo Cliente JCR iTech"&body="Dados para Inscrição%0A%0ANome:%0ATelefone:%0ATipo de serviço/compra:%0A%0A"
But I'm having a hard time discovering how to add bold or underline to the body text (if it's even possible). I read this article, but it seems the extra work isn't worth the effort (I want to keep it simple), so I just want to ask if anyone knows a simpler method to achieve this.
Thanks in advance.
In my case it was a local SPM package dependency that was missing but was still referenced by another local SPM package.
Open your local PowerShell profile in Notepad:
notepad $PROFILE.CurrentUserCurrentHost
Add the following line to your profile file, then save and close it.
$PSDefaultParameterValues['Invoke-Sqlcmd:Encrypt'] = 'Optional'
Note: create the PowerShell profile first if it doesn't already exist:
if (-not (Test-Path $PROFILE.CurrentUserCurrentHost)) {
New-Item -Path $PROFILE.CurrentUserCurrentHost -ItemType File -Force
}
I found a way to do what I needed.
I first created a cell with a dropdown list of the section headers in cell A2. This was not straightforward, because Google Sheets kept transforming the whole column into a dropdown column, which I didn't want. So I created the cell with the dropdown list in a temporary sheet, copied the cell, went to the cell where I needed the dropdown list, and pasted only the data validation.
Next I used this formula in cell A3:
=HYPERLINK(
  "#gid=12345678&range=" & SUBSTITUTE(
    SUBSTITUTE(
      CELL("address", XLOOKUP(A2, B:B, B:B)), "data!", ""
    ), "$", ""
  ), "GO TO SECTION")
So, I first have to select the header I want in cell A2, then click "GO TO SECTION" in A3. A popup with the link shows up, and clicking this link takes me to the header I need.
It's still two clicks more than ideal, but it does the trick.
MT5ManagerAPI.dll is a native C++ DLL. It's better to add a reference to ManagerAPI.NET.dll, the C# wrapper, and use that instead:
using System;
using Manager; // from ManagerAPI.NET.dll
class Program
{
static void Main(string[] args)
{
CManagerApi manager = ManagerFactory.Create();
}
}
You can create a new value that combines the index and the a column, and then split them apart when generating the axis:
alt.Chart(source).transform_calculate(
label=alt.datum["a"] + "_" + alt.datum["index"]
).mark_bar().encode(
alt.X("label:N", title="A").axis(labelExpr='split(datum.value, "_")[0]'),
alt.Y("b"),
)
I figured it out! I should have mentioned that I am using VS Code through Harvard's CS50, so my VS Code is tied to an online server. Because of that, the tkinter window opens on the server rather than on your own screen. After running your program that uses tkinter, click the CS50 menu on the left side and then click "GUI" to launch the noVNC client. This lets you see your window in a new tab.
In the vertical button panel on the right side, click the "Running Devices" button and select your device.
After that, the Layout Inspector will appear inside the Tools menu.