The error happens because you're trying to assign 3600 timestamps as the index (row labels) of a DataFrame that has only 200 rows. The number of items in the index must match the number of rows: you say the original data has 200 rows, and the error reports 3600.
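A minimal sketch of the mismatch and the fix (the column name and dates are illustrative):

```python
import pandas as pd

# 200 rows of data, but 3600 index labels -> the assignment fails
df = pd.DataFrame({"value": range(200)})
timestamps = pd.date_range("2024-01-01", periods=3600, freq="s")

try:
    df.index = timestamps
except ValueError as err:
    print(err)  # Length mismatch: Expected axis has 200 elements, new values have 3600

# The index must have exactly as many labels as the frame has rows
df.index = timestamps[:200]
print(len(df))
```

Either slice the timestamps down to the row count as above, or build the data so it actually has 3600 rows before assigning the index.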
It's likely too late to change the batch size once some agents have already entered the Batch block. Instead, try keeping track of the number of leftover agents before they enter the Batch block, for example in the preceding Queue block, and change the batch size from there.
Response received from the Xero Support team.
The scope needed to be amended to include the base "practicemanager" and/or "practicemanager.read"
Mexican biodiversity: a privilege with responsibility
According to the Spanish Inventory, those latitudes are home to 85 continental fish species, while in the farthest corner of Mexico, our Natural Protected Area, we have recorded 366.
In the Mexican archipelago, which we now protect in its entirety, there are 983 species of animals and plants, 88 of them endemic; that is, they exist nowhere else. These include Mexican species such as the vaquita, which lives only in the Upper Gulf of California and for which we have deployed an unprecedented conservation effort.
Our biodiversity is characterized by a large number of exclusive species. Roughly half of the plants found in our country are endemic: around 15,000 species that, if they disappeared from Mexico, would no longer exist anywhere.
Among reptiles and amphibians, 57 and 65 percent of species are endemic, respectively; among mammals (terrestrial and marine), 32 percent.
According to the National Commission for the Knowledge and Use of Biodiversity (Conabio), there are 23 groups of endemic species; the magnolias-and-daisies group alone accounts for more than 9,200 endemics, along with 2,564 beetles, 1,759 spiders, and 2,010 grasses and palms.
There are many more examples like these. Together with China, India, Colombia, and Peru, we are considered a megadiverse country. Collectively, these countries harbor between 60 and 70 percent of the planet's known biodiversity.
More than 10 percent of the world's biological diversity is represented in Mexico, even though we occupy barely one percent of the Earth's land surface.
Practically every known type of terrestrial vegetation is represented in our country, and some ecosystems, such as the wetlands of Cuatrociénegas in Coahuila, exist only in our territory.
Such is the scale of our natural privilege: an enviable wealth that opens opportunities for us as a country to enjoy, use, and share with the whole world in a sustainable way.
Our natural heritage is also a responsibility to the world. Conserving it is for the good of our children and of future generations everywhere. There is no better megadiverse country than one that knows what it has and protects it. That is our goal.
Why not put the embed code in a separate file and open it via an iframe? Then you can control its visibility from anywhere.
Here is a sample from our website -> https://www.kraljzara.si/
Just click on [ai] or LJUBO→ai in the header of our site.
<exclude-unlisted-classes>false</exclude-unlisted-classes>
You have to use additional extensions such as Live Server or Live Preview, which let you see the output of your code in your default browser by clicking the render button at the top right corner of VS Code.
I had a similar issue and resolved it by running pip install accelerate and then reloading the notebook kernel I was using.
1) Take the low 4 bits (bitwise AND with 0xF).
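For illustration, extracting the low nibble with a bitwise AND looks like:

```python
def low_nibble(value: int) -> int:
    """Return the lowest 4 bits (the low nibble) of an integer."""
    return value & 0xF

print(low_nibble(0xAB))  # 11, i.e. 0xB
print(low_nibble(0x7F))  # 15, i.e. 0xF
```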
https://bun.sh/docs/bundler/fullstack#using-tailwindcss-in-html-routes seems to document how to do this now (it probably didn't exist yet when this question was originally asked).
I keep getting this message when I try to install control-3.5.0.tar.gz.
I could use some advice, thanks.
./lti_input_idx.cc:96:5: error: unknown type name 'Range'
96 | Range mat_idx (1, idx-offset);
| ^
./lti_input_idx.cc:97:5: error: unknown type name 'Range'
97 | Range opt_idx (idx+1-offset, len);
| ^
2 errors generated.
make: *** [__control_helper_functions__.oct] Error 1
Hello everyone,
First of all, thank you @fabjiro: your answer was quite effective in helping me solve my problem. Starting from your solution, I improved it further to better suit my own project.
If there's any issue with the code blocks I've shared, please feel free to ask or point it out. Wishing everyone good work!
// File Name => ListDataGridPage.tsx
import React from 'react';
import { useLazyGetEmployeesQuery } from '../../../../../redux/slices/services/introductionApiSlices';
import { employeeColumns, EmployeeRowType, ListDataGridRef } from './listDataGridPageTypes';
import ListDataGrid from '../../../../../components/introduction/dataGrid/listDataGrid/ListDataGrid';
import BoxComp from '../../../../../components/base/box/Box';

const ListDataGridPage: React.FC = () => {
  const [triggerGetEmployees] = useLazyGetEmployeesQuery();
  const listDataGridRef = React.useRef<ListDataGridRef>(null);

  // States for infinite scroll implementation
  const [rows, setRows] = React.useState<EmployeeRowType[]>([]); // Stores all loaded rows
  const [skipCount, setSkipCount] = React.useState(0); // Tracks the number of items to skip
  const [loading, setLoading] = React.useState(false); // Prevents multiple simultaneous data fetches

  // Function to load more data when scrolling
  const loadData = async () => {
    if (!loading) {
      try {
        setLoading(true);
        const { data } = await triggerGetEmployees({
          maxResultCount: '40', // Number of items to fetch per request
          skipCount: skipCount.toString(), // Offset for pagination
        });
        if (data) {
          if (Array.isArray(data.data.items)) {
            // Append new items to existing rows
            setRows((prev) => [...prev, ...data.data.items]);
            // Increment skip count for next fetch
            setSkipCount((prev) => prev + 40);
          } else {
            console.error('Invalid data format: items is not an array', data);
          }
        }
      } catch (error) {
        console.error('Error fetching data:', error);
      } finally {
        setLoading(false);
      }
    }
  };

  // Load initial data on component mount
  React.useEffect(() => {
    loadData();
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  return (
    <BoxComp sx={{ height: 500, width: '100%' }}>
      <ListDataGrid
        ref={listDataGridRef}
        rows={rows}
        columns={employeeColumns}
        onNextPage={loadData}
        isLoading={loading}
        threshold={5} // Percentage threshold to trigger next page load
      />
    </BoxComp>
  );
};

export default ListDataGridPage;
// File Name => ListDataGrid.tsx
import React from 'react';
import useLanguageContext from '../../../../hooks/useLanguageContext';
import { ListDataGridProps, ListDataGridRef } from './listDataGridTypes';
import { listDataGridPropsPrepareColumn } from './listDataGridMethods';
import DataGridComp from '../../../base/dataGrid/DataGrid';
import { useGridApiRef } from '@mui/x-data-grid';

const ListDataGrid = React.forwardRef<ListDataGridRef, ListDataGridProps>((props, ref) => {
  const { columns, rows, onNextPage, isLoading = false, threshold = 0 } = props;
  const { translate } = useLanguageContext();
  const apiRef = useGridApiRef();

  // Refs for managing scroll behavior
  const scrollMonitor = React.useRef<() => void>(); // Tracks scroll event subscription
  const isInitialMount = React.useRef(true); // Prevents initial trigger
  const isRequestLocked = React.useRef(false); // Prevents multiple simultaneous requests

  // Handle scroll events and trigger data loading when needed
  const handleScroll = React.useCallback(() => {
    // Skip if a request is already in progress
    if (isRequestLocked.current) {
      return;
    }
    // Skip the first scroll event after mount
    if (isInitialMount.current) {
      isInitialMount.current = false;
      return;
    }
    if (apiRef.current?.instanceId) {
      const elementScroll = apiRef.current.rootElementRef.current?.children[0].children[1];
      if (elementScroll) {
        // Calculate scroll positions and threshold
        const maxScrollTop = elementScroll.scrollHeight - elementScroll.clientHeight;
        const scrollPosition = apiRef.current.getScrollPosition();
        const scrollThreshold = maxScrollTop * (1 - threshold / 100);
        // Check if we've scrolled past the threshold
        if (scrollPosition.top >= scrollThreshold) {
          // Lock requests to prevent multiple triggers
          isRequestLocked.current = true;
          // Trigger the next page load
          onNextPage?.();
          // Release the lock after a delay
          setTimeout(() => {
            isRequestLocked.current = false;
          }, 1000);
        }
      }
    }
  }, [apiRef, threshold, onNextPage]);

  // Set up scroll event listener
  React.useEffect(() => {
    if (apiRef.current?.instanceId) {
      // Subscribe to grid's scroll position changes
      scrollMonitor.current = apiRef.current.subscribeEvent('scrollPositionChange', () => {
        handleScroll();
      });
    }
    // Cleanup scroll event listener on unmount
    return () => {
      if (scrollMonitor.current) {
        scrollMonitor.current();
      }
    };
  }, [apiRef, handleScroll]);

  const preparedColumns = React.useMemo(() => {
    const preparedCols = columns.map((column) => ({
      ...listDataGridPropsPrepareColumn(column),
      headerName: column.isTranslation === false ? column.headerName : translate(column.headerName as string),
    }));
    return preparedCols;
  }, [columns, translate]);

  React.useImperativeHandle(ref, () => ({
    getDataGrid: () => apiRef.current,
  }));

  return (
    <DataGridComp
      apiRef={apiRef}
      columns={preparedColumns}
      rows={rows}
      showCellVerticalBorder={true}
      showColumnVerticalBorder={true}
      hideFooter={true}
      hideFooterPagination={true}
      hideFooterSelectedRowCount={true}
      loading={isLoading}
    />
  );
});

ListDataGrid.displayName = 'ListDataGrid';
export default React.memo(ListDataGrid);
// File Name => DataGrid.tsx
import useLanguageContext from '../../../hooks/useLanguageContext';
import { getLocaleText } from '../../../utils/locale/dataGridLocales';
import { Language } from '../../../utils/enums/languages';
import { DataGridCompProps, dataGridCompDefaultProps } from './dataGridHelper';
import { DataGrid } from '@mui/x-data-grid';

const DataGridComp = (props: DataGridCompProps) => {
  const { ...dataGridProps } = { ...dataGridCompDefaultProps, ...props };
  const { language } = useLanguageContext();
  return <DataGrid {...dataGridProps} localeText={getLocaleText(language as Language)} />;
};

DataGridComp.displayName = 'DataGridComp';
export default DataGridComp;
// File Name => listDataGridTypes.ts
import { GridApi } from '@mui/x-data-grid';
import { DataGridCompColDef, DataGridCompValidRowModel } from '../../../base/dataGrid/dataGridHelper';

export interface ListDataGridRef {
  getDataGrid: () => GridApi | null;
}

export interface ListDataGridProps {
  columns: DataGridCompColDef[];
  rows: DataGridCompValidRowModel[];
  // Function triggered when more data needs to be loaded
  onNextPage?: () => void;
  // Indicates whether data is currently being fetched
  isLoading?: boolean;
  // Percentage of scroll progress at which to trigger next page load (0-100)
  threshold?: number;
}
You can deploy a Node.js app with AWS Amplify. To do so, connect your Git repository to the Amplify console, configure build settings, and then deploy your application.
You can obtain the HR as follows:
predictions <- predict(cox, newdata = data, type = "lp", se.fit = T)
HR <- exp(predictions$fit)
With se.fit = TRUE, predict() returns the fit (the log HR) together with its standard errors.
Use extensions (from the Extensions tab) like Live Server; that might be a good start. Alternatively, in the top right corner you will find a render icon (a page with a magnifying glass) that will render the document for you.
HTML and CSS can only be viewed through a browser, so to check the output you can browse to your files and open them with any browser you want.
I'm facing the exact same issue and I'm not sure what the fix would be.
Here is a way using the BOL.
`(?m)^if.*\R+(?:^[\t ]+.*\R+)*?^[\t ]+test\.check\(\).*\R?`
https://regex101.com/r/WR9JtI/1
(?m)
^ if .* \R+
(?: ^ [\t ]+ .* \R+ )*?
^ [\t ]+ test \. check \( \) .* \R?
from datetime import date
# Get today's date as an ISO-formatted string (yyyy-mm-dd)
confirmation_date = date.today()
confirmation_date.isoformat()
Use puppeteer-real-browser. I tried it today and it works without any problems. The only thing I spotted: when you set turnstile: true,
input fields lose their focus every second, so you can set it to false.
Worked for me, thank you! I had the same issue even though I was using bare React Native, not Expo, but the fix here worked for me.
Setting Application.MainFormOnTaskbar to False will indeed work, but (I forgot to mention, my bad) I am experimenting with a very old Delphi version, so setting Application.MainFormOnTaskbar to False won't work there. I instead found a workaround (a tiny procedure):
procedure TFRMlogin.FRMshow(Form: TForm);
begin
SetWindowLong (Form.Handle, GWL_EXSTYLE,GetWindowLong (Form.Handle, GWL_EXSTYLE) or WS_EX_APPWINDOW) ;
end;
Yikes - 16 years and this still isn't figured out? I'm dealing with a similar situation, with an HR-16 drum machine (translating/converting old sysex dumps to something useable in a DAW). The post from 16 years ago pushed me in a right-er direction, but now that I'm seeing this one, I think I'm giving up - because I barely know anything about this kind of thing. If people who do know about this stuff haven't gotten it to work, what hope do I have... At minimum I'm simply trying to figure out version number of software on HR16 from which my sysex data were dumped (that's supposed to be in the sysex data)...
I have the same problem as you. Did you manage to solve it?
If anyone else has this issue, it seems to be caused by not having any active windows of your own app open.
So the payment dialog tries to centre on the menu icon in the corner, and also hides the Subscribe button for some reason.
This is a Kaggle bug. I've also had success with the mentioned workaround of toggling off/on the internet.
Have you had any luck with this? I noticed this in an App I'm working on that also supports macOS 13 to 15.
It seems to me that the issue is due to the fact that NSHostingView's sceneBridgingOptions targets macOS 14.0+, and on older releases the behavior is similar to setting an empty array for the options, meaning SwiftUI won't hand us its toolbars for free.
I haven't really found a solution other than recreating my toolbars from scratch in AppKit just to support Ventura (which implies manually hooking the AppKit implementation into SwiftUI views), but I'd love to hear if you've had a different experience.
Include the necessary code to start up a Python screen. (Import the library and generate a screen.)
Create a variable named is_string. Assign it to one of the above values that is actually a string.
Create a variable named is_string2. Assign it to one of the above values that is actually a string.
Create a variable named is_integer. Assign it to the above value that is actually an integer.
Create a variable named is_float. Assign it to the above value that is actually a float.
Create a variable named is_boolean. Assign it to the above value that is actually a Boolean.
Oh, found the answer. usort takes the callback name as a string, so the code should read:
<!DOCTYPE html>
<html>
<body>
<?php
function cb($l1, $l2) {
if($l1 == $l2) return 0;
return ($l1 < $l2) ? -1 : 1 ;
}
$res[0] = 102;
$res[1] = 101;
echo var_dump($res);
echo "<br>";
usort($res, "cb"); // notice the quotes around "cb"
echo var_dump($res);
?>
</body>
</html>
I read that earlier PHP versions were tolerant and implicitly quoted such calls...
Following is the sample table:
<body>
<table>
<tr><th>XY</th><th>ZW</th></tr>
<tr><td rowspan="3">321</td><td>242</td></tr>
<tr><td>513256</td></tr>
<tr><td>33131</td><td>13</td></tr>
<tr><td>4131</td><td>334132</td></tr>
<tr><td rowspan="3">51311</td><td>54424</td></tr>
<tr><td>54424</td></tr>
<tr><td>5442</td></tr>
<tr><td>511</td><td>544</td></tr>
</table>
<br />
<input type="text" id="search" placeholder="live search">
</body>
This can be caused by including extensions in the wrong order. Make sure you're loading htmx first, then the preload extension.
Fix for me was to replace:
/**
* @dataProvider validateTypeProvider
*
*/
with:
#[DataProvider('validateTypeProvider')]
and make sure the attribute is imported: use PHPUnit\Framework\Attributes\DataProvider;
This is the website where you can find the number of partitions required, given your throughput and partition speed:
In addition to Roland's reply, the instruction :
lea rsi, [rsp + rax * -1]
is valid and will compile.
@9769953 When I enter python -m pip install git+https://github.com/VarMonke/GitHub-API-Wrapper.git
I get this error message: ERROR: Cannot find command 'git' - do you have 'git' installed and in your PATH?
So, is this circular (I have to install github in order to install github)?
Heads up: the "next versions of OpenCV" that Juliusz Tarnowski was referring to in his answer are here now. Yes, the method setLegacyPattern has been added to the CharucoBoard class, which allows you to specify that the board was generated in the old-fashioned way. Moreover, you will no longer get the bad estimation; instead, cv2.interpolateCornersCharuco will return retval=0 and empty arrays if you try to detect a legacy board without having called setLegacyPattern(True) on the board passed to cv2.interpolateCornersCharuco.
Executing the command with az turned out to be very difficult (the revision error was not due to me adding a new revision; it appeared to be an old ghost from a previous time when multiple revisions were enabled), and trying variants of the command directly didn't help either. As a last resort I toggled between single and multiple revision mode (using the portal) to get rid of this mysterious old artefact, and then exactly the same container update worked successfully.
This solution doesn't work for me. I am using MVC 5.
It throws an exception in the OnResultExecuted method saying that the headers are read-only.
It was my mistake; I had added the validation/test code before any update happened. The value params in the lazy are correctly updated.
I tried executing
conda clean --all
In the command line in the environment and it worked for me
I got it from https://github.com/conda/conda/issues/7038#issuecomment-415893714
The full specs for the command are in https://docs.conda.io/projects/conda/en/stable/commands/clean.html
Here is my approach: I needed to populate a dataframe from the data for sentiment analysis.
for file_ in sorted(os.listdir(path)):
    with open(os.path.join(path, file_), 'r', encoding='utf-8') as infile:
        txt = infile.read()
    list_data.append([txt, labels[l]])  # labels and l come from the enclosing loop over sentiment classes
    pbar.update()  # advance the progress bar (defined in the surrounding script)
df = pd.DataFrame(list_data)
df.columns = ['review', 'sentiment']
Thank you very much for your super quick response. Clicking on the link just opens the browser stating "This site can't be reached" even when the phone is connected to the internet
This helped me a lot, thank you for including your command.
debug:
  msg: >-
    {{ forwarded_ports | json_query("[*].{external_port: @, internal_port: @}") }}
vars:
  forwarded_ports: [
    "123",
    "234",
    "456"
  ]
You could consider using conda-forge instead of the anaconda channel, as something seems to be off there. 2021.4 is quite an old version and will not support newer environments and NumPy.
Using Miniforge would be the simplest option - https://conda-forge.org/download/
Try this and tell me the results, or any error: intent://com.google.android.apps.maps/#Intent;scheme=android-app;end (put this as a link, replacing your "http:// bla bla bla" with it).
We can add the specific location later.
let
    Source = Excel.CurrentWorkbook(){[Name="Tabelle1"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Client", type text}, {"ID", Int64.Type}, {"Value", Int64.Type}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Client"}, {{"Max ID", each List.Max([ID]), type number}}),
    #"Merged Queries" = Table.NestedJoin(#"Grouped Rows",{"Client", "Max ID"},#"Source",{"Client", "ID"},"Source",JoinKind.LeftOuter),
    #"Expanded {0}" = Table.ExpandTableColumn(#"Merged Queries", "Source", {"Value"}, {"Value"}),
    #"Grouped Rows1" = Table.Group(#"Expanded {0}", {"Client", "Max ID"}, {{"Max Value", each List.Max([Value]), type number}})
in
    #"Grouped Rows1"
Another alternative could be Power Query, applying the above M code. The name of the blue dynamic table in my example is Tabelle1.
I got this name overlay by accident on a sheet and could not figure out how I got it or how to turn it off until I found this comment. Hooray! It's gone now. I had inadvertently set the zoom to 30%.
I’m experiencing the same issue. It works fine on Android, but on iOS, the app crashes when the account deletion confirmation dialog appears. If I set showDeleteConfirmationDialog: false, the deletion works fine. I’ll leave it like this until a solution is found.
I am searching for this answer too.
Got the same problem when migrating from LangChain v0.1 to v0.3.
I didn't really want to refactor half of my project, so I had to think a bit.
The only idea I had was to create my own LLMChain and just replace the imports.
Here's what I got:
from langchain.chains.llm import BaseMessage
from langchain_core.runnables import RunnableSerializable
LLMChain = RunnableSerializable[dict, BaseMessage]
And all you need is to replace this
from langchain.chains.llm import LLMChain
with this
from path.to.llm_chain import LLMChain
But then I ran into the fact that my LLMChain doesn't have a run method and that I need to use invoke instead.
I had no better idea, so I just replaced the calls, but if anyone has other ideas please add to this discussion.
Hope I helped someone in some way :D
I have now found the solution in Python and don't need it anymore:
from docx import Document
from docx.shared import Inches, Pt
from docx.enum.text import WD_PARAGRAPH_ALIGNMENT
from docx.oxml.ns import qn
from docx.oxml import OxmlElement
from PIL import Image
import os

# Load logo and resize
logo_path = "/mnt/data/file-7xXBwpX9eZCykswwjBMnV3"
logo = Image.open(logo_path)
logo_resized_path = "/mnt/data/logo_resized.png"
logo.thumbnail((150, 150))
logo.save(logo_resized_path)

# Create document
doc = Document()

# Title page with logo
doc.add_picture(logo_resized_path, width=Inches(1.5))
doc.paragraphs[-1].alignment = WD_PARAGRAPH_ALIGNMENT.CENTER
doc.add_paragraph().add_run("ASOCIACIÓN DEPORTIVA IBA CHILE").bold = True
doc.paragraphs[-1].alignment = WD_PARAGRAPH_ALIGNMENT.CENTER
doc.add_paragraph().add_run("Selección Oficial de Karate - Temporada 2025").bold = True
doc.paragraphs[-1].alignment = WD_PARAGRAPH_ALIGNMENT.CENTER
doc.add_paragraph().add_run("Compromisos y Requisitos para Atletas y Apoderados").bold = True
doc.paragraphs[-1].alignment = WD_PARAGRAPH_ALIGNMENT.CENTER
doc.add_page_break()

# Introduction
doc.add_heading("Introducción", level=1)
doc.add_paragraph(
    "Estimadas familias:\n"
    "La presente carta tiene como objetivo formalizar los compromisos y responsabilidades que deben asumir tanto los atletas seleccionados "
    "como sus respectivos apoderados o tutores, en el marco del proceso competitivo y formativo de la Selección Oficial de Karate de la Asociación IBA Chile "
    "para la temporada 2025.\n\n"
    "Representar a nuestra asociación en competencias regionales, nacionales e internacionales es un privilegio que implica esfuerzo, disciplina y trabajo conjunto. "
    "Por ello, es fundamental establecer un marco de compromiso y responsabilidad que asegure un proceso serio, respetuoso y coherente con los valores del karate "
    "y del deporte en general.\n\n"
    "A continuación, se detallan los compromisos que deberán ser asumidos de forma íntegra por cada parte involucrada."
)

# Commitments
doc.add_heading("Compromisos del Atleta", level=1)
athlete_points = [
    "Asistir puntualmente a todos los entrenamientos, evaluaciones y actividades programadas por el cuerpo técnico.",
    "Mantener una conducta respetuosa, disciplinada y colaborativa con compañeros, entrenadores y dirigentes.",
    "Cumplir con las indicaciones técnicas, físicas y de preparación entregadas por el equipo de trabajo.",
    "Usar correctamente el uniforme de la selección y mantener una buena presentación personal.",
    "Participar activamente en actividades de preparación física, técnica, táctica y mental."
]
for point in athlete_points:
    doc.add_paragraph(point, style='List Number')

doc.add_heading("Compromisos del Apoderado o Tutor", level=1)
parent_points = [
    "Velar por la asistencia y puntualidad del atleta a todas las actividades programadas.",
    "Mantener una comunicación fluida y respetuosa con entrenadores y directivos.",
    "Apoyar el proceso formativo y competitivo del atleta, fomentando valores como la responsabilidad, el respeto y el compromiso.",
    "Cumplir con los aportes económicos necesarios para inscripciones, traslados, implementos u otros gastos asociados.",
    "Firmar las autorizaciones necesarias para viajes, actividades o compromisos especiales.",
    "Responsabilizarse por la alimentación del atleta y colaborar en el control del peso corporal, asegurando que se mantenga dentro de los rangos establecidos para su categoría competitiva."
]
for point in parent_points:
    doc.add_paragraph(point, style='List Number')

# Final section
doc.add_heading("Disposiciones Finales", level=1)
doc.add_paragraph(
    "La firma del presente documento representa el compromiso formal y voluntario de cada parte para cumplir con los puntos mencionados anteriormente. "
    "El incumplimiento de alguno de estos compromisos podrá conllevar sanciones internas o exclusión del proceso selectivo, de acuerdo a las normas internas "
    "de la Asociación IBA Chile."
)

# Signature section
doc.add_page_break()
doc.add_heading("Firmas de Compromiso", level=1)
table = doc.add_table(rows=7, cols=2)
table.style = 'Table Grid'
signers = [
    ("Nombre del deportista:", "Firma del deportista:"),
    ("Nombre del apoderado:", "Firma del apoderado:"),
    ("Nombre del entrenador 1:", "Firma entrenador/a 1:"),
    ("Nombre del entrenador 2:", "Firma entrenador/a 2:"),
    ("Nombre del comité de la federación:", "Firma comité de la federación:"),
    ("Nombre Presidenta de la Federación:", "Firma Presidenta de la Federación:"),
    ("Fecha:", "Lugar:")
]
for row, (label1, label2) in zip(table.rows, signers):
    row.cells[0].text = label1
    row.cells[1].text = label2

# Save Word and PDF
docx_path = "/mnt/data/Compromiso_Selección_Karate_IBK_2025.docx"
pdf_path = "/mnt/data/Compromiso_Selección_Karate_IBK_2025.pdf"
doc.save(docx_path)
docx_path  # Returning path for download
You need to follow these steps:
1 - npx expo-doctor (checks for any outdated dependencies). If there are any, run:
2 - npx expo install --check
3 - npm install [email protected]
4 - add the required entry to the 'app.json' file (shown in the original screenshot)
5 - Once this is done, all you have to do is build the app for Android: 'eas build --platform android'. You don't need to run any prebuild.
You can try restarting the development machine and the device or emulator.
If you're using a physical device, check that your USB cable is properly connected and that you have enabled USB debugging on the device.
Close and restart the emulator.
So it turns out that defining a and b outside of the __init__ constructor defines those variables as so-called class variables, which are shared across all instances (objects) of that class. This does not explain how the values of b are kept separate, but defining a inside the constructor as self.a = [0, 0] seems to do the trick.
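A small sketch of the difference (class and attribute names are illustrative):

```python
class Shared:
    a = [0, 0]  # class variable: one list shared by every instance

class Separate:
    def __init__(self):
        self.a = [0, 0]  # instance variable: a fresh list per instance

s1, s2 = Shared(), Shared()
s1.a[0] = 99
print(s2.a)  # [99, 0] -> the mutation is visible through the other instance

p1, p2 = Separate(), Separate()
p1.a[0] = 99
print(p2.a)  # [0, 0] -> each instance has its own list
```

Note that `s1.a[0] = 99` mutates the shared list in place; a plain rebinding like `s1.a = [1, 2]` would instead create an instance attribute that shadows the class variable.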
It's a good idea to have a CORS restriction for this, but you should not rely solely on it. A CORS restriction does not prevent a call to your endpoint; it only prevents the response of the call from being read.
CORS restrictions also do nothing if a malicious user calls your endpoint from the command line or from a non-browser client.
To make your endpoint secure, you should ensure the caller has proper authorization to use the endpoint.
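As a framework-agnostic sketch (the header layout and token value are made up for illustration), a server-side authorization check might look like:

```python
import hmac

EXPECTED_TOKEN = "s3cret-api-token"  # illustrative; load from secure config in practice

def is_authorized(headers: dict) -> bool:
    """Return True only when the request carries a valid bearer token."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    token = auth[len("Bearer "):]
    # Constant-time comparison avoids leaking the token via timing differences
    return hmac.compare_digest(token, EXPECTED_TOKEN)

print(is_authorized({"Authorization": "Bearer s3cret-api-token"}))  # True
print(is_authorized({"Origin": "https://evil.example"}))            # False
```

Unlike CORS, this check rejects the request no matter where it comes from: a browser, curl, or any other client.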
TLDR: the problem was mistake in an Apache mod_rewrite directive; it had nothing to do with Asset Mapper.
As noted in one of my comments above: it was because of an Apache mod_rewrite condition in my vhost config, which needed to be either inside a Directory block or else be written as `RewriteCond %{DOCUMENT_ROOT}%{REQUEST_FILENAME} !-f`, i.e., including the %{DOCUMENT_ROOT}. The test was coming back false, so files that did exist were not getting served. As for the issue with dev vs prod, I'm not sure it was a total illusion, but it led me down the wrong path (so to speak). Of course, this whole question I've posted is such an embarrassment that I'm tempted to remove it. But there is a chance it could help somebody someday.
Simple fix:
Go to Play Console → Setup → App Signing, copy the SHA-1 and paste it into Firebase Console > Project Settings > Android App.
Download a new google-services.json after adding the key, and add it to your app.
Go to Play Console → App Integrity → Link project, and link your Firebase project.
That’s it. Worked like a charm after publishing!
Found a solution based on Typescript convert type Tuple of tuple to Tuple (flatten Tuple):
// Solution using a recursive type
type FlattenSortOptions<T> = T extends readonly [infer First, ...infer Rest]
  ? First extends { children: readonly unknown[] }
    ? [...FlattenSortOptions<First["children"]>, ...FlattenSortOptions<Rest>]
    : [First, ...FlattenSortOptions<Rest>]
  : [];

function flattenSortOptions<T extends SortOptions<unknown>>(
  options: T,
): FlattenSortOptions<T> {
  return options.flatMap((option) => {
    if (typeof option === "object" && option !== null && "children" in option) {
      return option.children;
    }
    return option;
  }) as FlattenSortOptions<T>;
}
Does anybody know whether I can send a file (a picture) to Synology Chat as an actual file rather than by direct link? Or do I have to upload it somewhere, get a link, and only then send it?
In general, you can create an HTTP redirect to the corresponding [SPARQL graph store protocol request](https://www.w3.org/TR/2013/REC-sparql11-http-rdf-update-20130321/#http-get) of a SPARQL endpoint.
However, I did not test whether this is supported by Virtuoso; its SPARQL implementation is incomplete.
Also note the `ResultSetMaxRows` configuration of your Virtuoso instance, which might cause incomplete results.
Seems like a bug. It works with OpenAI's implementation.
I opened an issue: https://github.com/langchain4j/langchain4j/issues/2815
In my case it was the TypeScript version in the IDE.
Stake Deposit Failed Problem
Dear Stake Support Team,
On 07 March 2025, I initiated a deposit of 500 INR Rupees to my Stake account wallet. Unfortunately, the amount has been debited from my Bank Account but has not been credited to my Stake wallet. Additionally, I have not received any refund in my bank account.
As it has been more than 26 days, kindly resolve this matter promptly by either crediting the amount to my Stake wallet or refunding it to my bank account.
Details of Transaction.
Amount 500 INR Rupees
UPI Transaction ID-506639338002
Stake Deposit ID-2512875571
Stake Account ID/Gmail [email protected]
Phone Number-7069256145
Stake ID Name - Gopal Parmar
Please reply as soon as possible and 2 Screenshots of Transaction are attached below.
Thank You,
Best regards,
Gopal Parmar
2 and 3 can be reordered. See Peter Cordes's answer here: https://stackoverflow.com/a/77080997/19260728
The store side of the ++x and the subsequent acquire load can be reordered unless you have a StoreLoad barrier: either std::atomic_thread_fence(std::memory_order_seq_cst), or just making load #3 also seq_cst.
In my case, the date passed from Python to SQL was not formatted correctly. Once it was passed as 'yyyy-mm-dd' to the SQL Statement, then problem was solved.
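A sketch of the idea (the date value and the parameterized query in the comment are illustrative):

```python
from datetime import date

d = date(2024, 3, 7)

# Format explicitly as ISO 'yyyy-mm-dd' before handing the value to SQL
formatted = d.strftime("%Y-%m-%d")
print(formatted)  # 2024-03-07

# Safer still: pass the date as a bound parameter instead of building the
# SQL string yourself, e.g.
#   cursor.execute("SELECT * FROM t WHERE dt = ?", (formatted,))
```

Binding parameters lets the driver handle date conversion and also avoids SQL injection, so the explicit formatting is mainly useful when the driver requires plain strings.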
The error you're seeing:
No module named 'seaborn.external.six.moves'
usually indicates that your version of Seaborn is outdated or not compatible with your current Pandas or Python setup.
This error is often fixed in newer versions. In your terminal or Jupyter Notebook, run:
pip install seaborn --upgrade
Or in a Jupyter cell:
!pip install seaborn --upgrade
After updating, make sure to restart your Jupyter Notebook kernel to reload the updated package.
In Jupyter:
Kernel > Restart Kernel
To confirm you're using a recent version:
import seaborn as sns
print(sns.__version__)
✅ At the time of writing, the latest stable version is 0.12.2 or above. Make sure you're at least on version 0.11.0.
You may also need to upgrade your Pandas and Matplotlib versions:
pip install pandas --upgrade
pip install matplotlib --upgrade
Add this to package.json, under scripts:
"vercel-build": "npm ci && npm run build",
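In context, the scripts block might look like this (a sketch; the "build" script shown is an assumption and depends on your framework):

```json
{
  "scripts": {
    "build": "next build",
    "vercel-build": "npm ci && npm run build"
  }
}
```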
(This being my first interaction on Stack Overflow, I hope I am not breaking any rules by confirming @AlexanderMashin's answer (in the comments section) with a detailed explanation from DeepSeek.)
@AlexanderMashin pointed me to the right direction. I asked DeepSeek (DeepThink R1):
"Explain me why in Lua when modifying a copy of a table, made in a non deep copy method, for example by declaring a `local table_copy = table_original`, the original table changes too? Why does this happen?"
DeepSeek's answer confirmed points 1) and 4) of @AlexanderMashin's comments, which were:
"1) Lua tables are passed by reference. You need to clone a Lua table if you want to change the copy without affecting the original. 4) Is the ultimate purpose of your code to create a table based on a template, in which metacharacters are replaced by values from another table?"
And its answer was:
"In Lua, tables are reference types, not value types. When you assign a table to a new variable using local table_copy = table_original, you are not creating a new table. Instead, you are creating a new reference to the same underlying table data in memory. This means:
Tables are references: when you write table_copy = table_original, both table_copy and table_original point to the same table in memory. Changes made via either variable will affect the same underlying data.
local original = {value = 10}
local copy = original -- Both variables reference the same table
copy.value = 20 -- Modify the shared table
print(original.value) -- Output: 20 (original is changed!)
Shallow copies are not enough for nested tables:
Even if you manually copy keys with a loop (a "shallow copy"), nested tables inside the original table will still be referenced, not cloned:
function shallowCopy(original)
  local copy = {}
  for k, v in pairs(original) do
    copy[k] = v -- Copies primitive values (numbers, strings) but references nested tables
  end
  return copy
end
local original = {nested = {value = 10}}
local copy = shallowCopy(original)
copy.nested.value = 20 -- Modifies the original's nested table
print(original.nested.value) -- Output: 20
--------------------------------
A deep copy recursively clones all nested tables, ensuring no shared references:
function deepCopy(original)
  local copy = {}
  for k, v in pairs(original) do
    if type(v) == "table" then
      copy[k] = deepCopy(v) -- Recursively clone nested tables
    else
      copy[k] = v -- Copy primitives directly
    end
  end
  return copy
end
local original = {nested = {value = 10}}
local copy = deepCopy(original)
copy.nested.value = 20
print(original.nested.value) -- Output: 10 (original remains unchanged)
Assignment (=) does not clone tables; it creates a new reference to the same data.
Shallow copies copy top-level keys, but nested tables remain shared references.
Deep copies are required to fully clone all nested structures.
This behavior is intentional in Lua to avoid unnecessary memory duplication, but it requires awareness when working with mutable data."
Conclusion:
For me as a beginner this is a revelation; now I will act according to this behavior of Lua tables that had me perplexed. "This behavior is intentional in Lua to avoid unnecessary memory duplication, but it requires awareness when working with mutable data." Thanks to @AlexanderMashin!
As @Sean pointed out, .background would need to be a parent of h1 in order for the mix-blend-mode effect to apply to it. To demonstrate, just add background-color: #0044bb to h1's immediate parent .hero .foreground .container and you'll see the letters inside .container will be black.
Quote from the updated guidance:
"We're taking action to ensure the continued operation of deployed VPN gateways that use Basic SKU public IP addresses until the retirement of Basic IP in September 2025. Before this retirement, we'll provide customers with a migration path from Basic to Standard IP.
However, Basic SKU public IP addresses are being phased out. Going forward, when you create a VPN gateway, you must use the Standard SKU public IP address. You can find details on the retirement of Basic SKU public IP addresses in the Azure Updates announcement."
https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/public-ip-basic-upgrade-guidance
class MyPandasData(bt.feeds.PandasData):
params = (
('open', 1),
('high', 2),
('low', 3),
('close', 4),
('volume', 5),
)
I doubt the issue is with the function cols. Is there any built-in functionality named cols to map columns? Can you try params instead of cols?
I think @Dominik Kaszewski's answer is more appropriate for what you want to do, but there is also the configure_file mechanism:
configure_file(myfile.h.in myfile.h)
CMake will take myfile.h.in and expand CMake variables in it:
static constexpr const char * build_type = "${CMAKE_BUILD_TYPE}";
to create a file that contains:
static constexpr const char * build_type = "Debug";
for example.
If you have other ${...} constructs in the file and don't want those expanded, you can use @CMAKE_BUILD_TYPE@ instead and add the @ONLY argument to configure_file:
configure_file(myfile.h.in myfile.h @ONLY)
This is an alternative to passing the values with target_compile_definitions. Since it is more complicated than what @Dominik Kaszewski suggests, I would go with their answer, but if you have more information that you want to pass from CMake to the source code, then this is a convenient way to do it.
Based on your query "I want SAA I and SAA O," I'll assume you're asking for a breakdown of transactions in your dataset where "SAA I" and "SAA O" refer to specific conditions, and you want metrics like the number of transactions and total value for each. Since no specific metrics were provided, I'll interpret "SAA I" as transactions where the T_FROMAPPLI column equals "SAA" and the I_O column equals "I" (Input), and "SAA O" as transactions where T_FROMAPPLI equals "SAA" and I_O equals "O" (Output). I'll provide a step-by-step guide to achieve this in Power BI, along with results based on the sample data.
Counts: rows where T_FROMAPPLI = "SAA" and I_O = "I" (for "SAA I"), and rows where T_FROMAPPLI = "SAA" and I_O = "O" (for "SAA O").
Values: the sum of the AMOUNT column for those rows.
This interpretation aligns with the dataset's structure, where T_FROMAPPLI indicates the application source (e.g., "SAA") and I_O indicates the transaction direction (Input or Output).
First, make sure T_FROMAPPLI, I_O, and AMOUNT are correctly typed: T_FROMAPPLI and I_O as Text, and AMOUNT as Decimal Number.
To calculate the metrics, we'll create measures in Power BI's Data Analysis Expressions (DAX).
Go to the Modeling Tab:
Define Measures:
SAA I Count = CALCULATE(COUNTROWS(Transactions), Transactions[T_FROMAPPLI] = "SAA", Transactions[I_O] = "I")
SAA I Value = CALCULATE(SUM(Transactions[AMOUNT]), Transactions[T_FROMAPPLI] = "SAA", Transactions[I_O] = "I")
SAA O Count = CALCULATE(COUNTROWS(Transactions), Transactions[T_FROMAPPLI] = "SAA", Transactions[I_O] = "O")
SAA O Value = CALCULATE(SUM(Transactions[AMOUNT]), Transactions[T_FROMAPPLI] = "SAA", Transactions[I_O] = "O")
Add a Table Visual: add the measures (SAA I Count, SAA I Value, SAA O Count, SAA O Value) to the Values area, then format the table as desired.
Alternative: Use Card Visuals, one per measure.
From the sample data provided (23 rows), let's check:
SAA I (T_FROMAPPLI = "SAA" and I_O = "I"): no matching rows.
SAA O (T_FROMAPPLI = "SAA" and I_O = "O"): two rows, one with AMOUNT = 21526.52 (USD) and one with AMOUNT = 20171.83 (AED).
Output in Power BI:

| Metric      | Value    |
|-------------|----------|
| SAA I Count | 0        |
| SAA I Value | 0        |
| SAA O Count | 2        |
| SAA O Value | 41698.35 |
For a different layout, use a matrix visual with I_O on rows (it shows "I" and "O") and the measures as values, and filter T_FROMAPPLI to "SAA" under Filters on this visual.
A few notes:
The sample data has no "SAA I" rows (T_FROMAPPLI = "SAA" and I_O = "I"). Your full dataset might have them, and the measures will reflect that.
The "SAA O" total mixes currencies; if needed, convert AMOUNT to one currency (e.g., USD) using an exchange rate (e.g., 1 AED = 0.27 USD), then adjust the measures.
If "SAA" is defined differently in your data (e.g., SYSTEM_ID containing "SAA"), let me know, and I'll adjust the filters.
This solution provides a clear, actionable way to see "SAA I" and "SAA O" metrics in Power BI. Let me know if you need further customization!
I haven't gotten around to implementing this, but @Peter Cordes proposed that I should take a look into the Linux kernel.
Thanks!
I solved it by moving the assets folder from the lib folder to the root of the project.
If you only need to add "like" functionality, then you can remove the liked column from your table. As you said, if a user likes an image you are going to add a new row to the likes table. But if you want to add a "dislike" feature in the future, then you need the liked column: if the user likes an image you add a new record with the liked column set to true, and if the user dislikes an image you add a new record with the liked column set to false.
I realized that, for some reason, there were still external React elements in the built files. I used bunchee as the build tool and that solved the issue.
Thank you, it works :)
"watcher": {
"files": "**/*.{js,css,css.map,min.css,min.js}",
"autoUpload": true,
"autoDelete": false
}
When you put "autoUpload": true between the curly brackets of the "watcher" parameter, it works perfectly :)
Found the issue.
PHP doesn't like whitespace near the <?php tag; anything outside the tags gets sent as output.
I went through all the files looking for tabs/spaces/newlines near the <?php tag and deleted them. I had an empty line in my web.php file.
That was painful to figure out.
I'm having the same problem: I created a small game in Delphi 12.1 CE FireMonkey and the music has to play in a loop. Not even with TTimer or TThread could I get it to work on Android 64 release, that is, in production.
It only works properly on Android 32/64 in debug mode, or on Windows 32/64 (debug or release).
A short presentation on the Emirates airline and its airport (prices included)
---
1. Introduction
Emirates is one of the world's leading airlines.
Dubai International Airport (DXB) is one of the busiest passenger airports in the world.
The airport serves more than 90 million passengers every year.
Price: Emirates ticket prices vary by route, time of travel, and cabin class. An economy-class ticket costs USD 500-1000 on average.
---
2. Dubai International Airport (DXB)
The largest transit hub in the world.
Runways: at 4.5 km, among the longest runways in the world.
A dedicated terminal for Emirates: one of the 3 terminals serves Emirates flights only.
---
3. Distinctive Features of the Emirates Airline
World-class service: private suites and VIP services for business- and first-class passengers.
Airbus A380 fleet: one of the largest A380 fleets in the world.
SkyCargo: the airline is a leading player in worldwide air cargo.
---
4. Environmental Responsibility
Green technologies: environmentally friendly aircraft and improved energy efficiency.
New aircraft have been brought into service to reduce carbon emissions.
---
5. Conclusion
The Emirates airline and Dubai International Airport are among the world's leading air-travel hubs.
Price: in keeping with the high quality of service, Emirates fares are usually on the high side, but they match the quality and comfort offered.
---
What you want can be achieved with #ifdef. Since debug builds commonly set -DDEBUG=1 (or -DDBG=1; you can easily control it either way), you can check for it to determine the mode you are in:
#ifdef DEBUG
static constexpr const char* mode = "debug";
#else
static constexpr const char* mode = "release";
#endif
printf("You are in %s mode\n", mode);
Other config options can be passed as preprocessor flags in the same way.
In this case, sort() works:
> vec <- c("x", "y", "z")
> vec <- sort(c(vec, paste0(vec, ".1")))
> vec
[1] "x" "x.1" "y" "y.1" "z" "z.1"
I would recommend using the popular library Socket.IO if you are working with Express and Node.
The examples I have added use React as well.
Server
"Receive" events in the server and do the backend and database work.
server/index.js
// `server` here is your Node http.Server instance
const io = require('socket.io')(server, {
  // config options
})
io.on('connection', (socket) => {
  socket.on('setup', (userData) => {
    // Code to be executed when the socket is set up.
  })
  socket.emit('connection')
  socket.on("join chat", (room) => {
    // When someone joins the chat
  })
  socket.on("new message", (newMessageReceived) => {
    // When a new message is received
  })
  // Can also add any other relevant events...
})
Client
"Emit" events in the client as the user interacts.
client/components/src/ChatArea.jsx
// Setup the socket on the client side.
useEffect(() => {
  socket = io(ENDPOINT)
  socket.emit('setup', user)
  socket.on('connection', () => {
    setSocketConnected(true)
  })
}, [])
// Emit the "new message" event.
socket.emit('new message', data)
You can refer to my chat application: https://github.com/Jay-Karia/Hello
Main files:
server: https://github.com/Jay-Karia/Hello/blob/main/server/index.js#L28-L60
client: https://github.com/Jay-Karia/Hello/blob/main/client/src/Components/ChatArea.jsx#L80
It's probably your browser. I was using Chrome, and the only protocols being shown were TCP, UDP, TLSv1.3 and so on. Then I switched my browser to Safari and it worked.
I used fs.azure.data.blocks.buffer: bytebuffer in my Flink configuration, and this fixed the above issue.
You're correct: Python has no native support for true constants the way C, C++, or Java do. There is no const keyword, and even the convention of writing names in all caps (PI = 3.14) depends entirely on developer discipline.
Although workarounds such as metaclasses or special classes can be used to mimic immutability, none of them can prevent reassignment at the module level or stop a determined user from simply overwriting your values.
Therefore, I created something that does.
Introducing setconstant: Immutability in Python Constants
Python does not enforce immutability, and that can lead to unintended value mutations, particularly in big codebases. So I made setconstant, a lightweight Python package that lets you define true constants which cannot be mutated after declaration.
How It Works
import setconstant
setconstant.const_i("PI", 3)
setconstant.const_f("GRAVITY", 9.81)
setconstant.const_s("APP_NAME", "CodeHaven")
print(setconstant.get_constant("PI")) # Output: 3
print(setconstant.get_constant("GRAVITY")) # Output: 9.81
print(setconstant.get_constant("APP_NAME")) # Output: CodeHaven
# Trying to overwrite a constant raises a ValueError:
# setconstant.const_i("PI", 4)
Once declared, constants are locked. Any attempt to overwrite them throws a ValueError, protecting your code from bugs or unintended logic changes.
Why Not Just Use a Class or Metaclass?
class Constants:
    PI = 3.14

    def __setattr__(self, name, value):
        raise AttributeError("Cannot reassign constants")
Or even using metaclasses like:
class ConstantMeta(type):
    def __setattr__(cls, name, value):
        if name in cls.__dict__:
            raise AttributeError(f"Cannot modify '{name}'")
        super().__setattr__(name, value)
But these solutions have their limitations:
They can be circumvented with sufficient effort.
They are not intuitive to beginners.
They need boilerplate or more advanced Python functionality such as metaclasses.
setconstant sidesteps all of that. It's simple, strict, and does what constants are supposed to do.
Installation
pip install setconstant
Feedback is always welcome!
Anuraj R, Creator of setconstant
1. Make main async:
void main() async {}
2. Initialize Firebase in main:
WidgetsFlutterBinding.ensureInitialized();
await Firebase.initializeApp();
Final code:
void main() async {
WidgetsFlutterBinding.ensureInitialized();
await Firebase.initializeApp();
runApp(const MyApp());
}
From those tables, I created a view called USCalendar that calculates the end date, given a start date (03/01/21) and number of business days (180) - works perfectly!
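As a cross-check outside SQL, NumPy's business-day helpers can do the same arithmetic. Note this is only a sketch: it assumes a plain Mon-Fri week with no holiday calendar, which may differ from the USCalendar view built from the calendar tables.

```python
import numpy as np

# The end date 180 business days (Mon-Fri, no holidays) after 2021-03-01.
start = np.datetime64("2021-03-01")
end = np.busday_offset(start, 180, roll="forward")
print(end)
```

Passing a holidays= array to busday_offset would bring it closer to a holiday-aware calendar table.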
Refer to my answer on a similar question: https://stackoverflow.com/a/79558341/14164142
TL;DR: Electron's PowerMonitor API does not work everywhere (as of right now, April 2025), so I built a different way to do it, which you can use in the interim.
Call Data_Viewer_List1's "Refresh Data" block in your code every time your database is updated.
I can also confirm that as of Ubuntu 24.04.2 LTS the default fail2ban configuration of backend = systemd does not seem to work properly for jails monitoring external log files, for example postfix-sasl. After adding backend = auto to all jails in /etc/fail2ban/jail.local, bans immediately started to appear in /var/log/fail2ban.log.
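For illustration, a per-jail override in /etc/fail2ban/jail.local might look like this (a sketch; the logpath shown is an assumption and depends on your mail setup):

```ini
[postfix-sasl]
enabled = true
backend = auto
logpath = /var/log/mail.log
```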
For me, changing DATABASE_URL to DIRECT_URL in schema.prisma worked.
You might need to add a width to your parent div. The div that contains the button does not have any width, so it shrinks to its minimum size; you may have to set a width on this parent div. You also haven't set a height on any div, so you will not be able to align the content vertically in the center of the page.
I want to reply to my own question just to update this post.
Sorry for writing only after many days of silence; I read all the suggestions and I want to say thanks to everyone!
I've experimented further with the integration of the library (and with many other things). I successfully used "Method 2", but I like to keep my project as clean as possible, so I didn't like that my folder/file tree was filled with numerous header and source files.
I ended up compiling the library into a shared library (.so in my case) and using it in my project in a much cleaner way (IMHO), basically using only the compiled library and a header file.
Warnings keep appearing, but for now it's fine like this.
I recently encountered a similar problem (error C3859: Failed to create virtual memory for PCH) where the root cause was Windows Explorer leaking memory. It had accumulated a commit of over 30GB in just 10 days of uptime. I have 32GB of RAM and limit the page file to 32GB, so in combination with everything else the system had actually run out of memory. Terminating the process solved the issue (for now).
Make sure the Paid Apps Agreement is active — this might be the cause of the issue.
Unfortunately, as of right now (April 2025) there are issues with Electron PowerMonitor on Linux systems, specifically when you are using Wayland under the hood (which is the default). I won't pretend I understand everything about this issue, but it does seem that it might be an upstream issue (meaning there is not much Electron can do about it directly), so a fix might take some time to appear (source: https://github.com/electron/electron/issues/27912)
Personally, I ended up building my own native code to do this in a cross-platform way, and ended up forking, updating and re-releasing a project that was tackling the same issue 8 years ago. I plan to keep using and supporting it for as long as this remains an issue; hope someone else finds it helpful.
So, I don't know the reason, but from a fellow commenter on YouTube I learnt that if you add requires java.desktop; inside your module declaration in module-info.java (project > src > module-info.java), all JDK modules will be shown again and ready to be used. Credit: @Eng.Mahmoud-t4y