Faced the same problem. Used:
optimizer=tf.keras.optimizers.Adam()
in place of:
optimizer = tf.train.AdamOptimizer()
inside the compile() call.
(My TF version is 2.11.0. This works for both TF-CPU and TF-GPU.)
I implemented @Mark's answer as a LibreOffice Calc macro, made the number of decimals an option, and added the rest of the prefixes:
' Display a number in engineering notation, using SI prefixes.
' Based on https://stackoverflow.com/a/55382156
Option VBASupport 1
Function ENG(value as Double, decimals)
    normalized = ROUND(value / (1000 ^ INT(LOG(ABS(value)) / LOG(1000))), decimals)
    prefix = CHOOSE(INT(LOG(ABS(value)) / LOG(1000)) + 11, _
        "q","r","y","z","a","f","p","n","µ","m","", _
        "k","M","G","T","P","E","Z","Y","R","Q")
    ENG = normalized & prefix
End Function
I expect it would work in Excel after removing the Option VBASupport 1 line, but haven't tested that yet.
Once it's added and macros are enabled, it can be used like any other function in a cell formula (e.g. =ENG(A1,2)).
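For anyone who'd rather prototype the formula outside Calc, here is a rough Python sketch of the same engineering-notation logic (my own illustration, not part of the macro):

```python
import math

# SI prefixes from 10^-30 (quecto) up to 10^30 (quetta);
# index 10 is the empty prefix for values in [1, 1000).
PREFIXES = ["q", "r", "y", "z", "a", "f", "p", "n", "µ", "m", "",
            "k", "M", "G", "T", "P", "E", "Z", "Y", "R", "Q"]

def eng(value, decimals=2):
    """Format a number in engineering notation with an SI prefix."""
    if value == 0:
        return "0"
    exp = math.floor(math.log(abs(value), 1000))   # power of 1000
    normalized = round(value / 1000 ** exp, decimals)
    return f"{normalized}{PREFIXES[exp + 10]}"

print(eng(4700))      # prints 4.7k
print(eng(0.000033))  # prints 33.0µ
```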
Hmm, by default the browser triggers three phases of event propagation (capture, target, bubble) when an event fires. During those phases, any element that has subscribed to that event will have its handler called.
Yes, the MobileSubstrate framework has MSHookFunction, which can do what you said.
This is, in fact, a problem with Windows.
More specifically, the default Windows terminal emulator has Ctrl+V bound to paste by default.
To fix this, click the dropdown arrow at the top of the window, open Settings, and remove the shortcut under the 'Actions' section.
Try updating ngrok in PowerShell. I was facing the same issue; after updating the ngrok version I was able to get a 200 OK response from the webhook API.
If you want local date:
java.sql.Date sqlDate = java.sql.Date.valueOf(LocalDate.now());
Recently I learned that README.md can go inside the .github directory. It will still be visible to repo visitors.
An example is Ryan Dahl's blog:
If you are doing this to solve Wordle puzzles, you are lacking the additional constraints that would arise if you've guessed a word that has some letters that must be present but are not in the right place. Here is a piped command line that applies all the known constraints as an example (this is for Windows cmd.exe, modify as needed for Linux or MacOS command syntax):
grep -E "^[a-z]{5}$" c:\bin\words.txt | grep "pr.[^i]." | grep -v "[outyase]" | sed "/i/!d"
Explanation:
The first grep command uses -E to enable extended regular expressions. "^" matches the beginning of the line and "$" matches the end of the line, so this returns all 5-letter words (without using -E, you could say grep "^[a-z][a-z][a-z][a-z][a-z]$" instead). The second argument is a file with a word dictionary.
In the second command, a 5-letter pattern "pr.[^i]." is given; the first two letters are "pr", the third and fifth can be anything, and the fourth letter can be anything but "i". If you have more than one letter that cannot be in one position, just include all letters within the brackets. The ^ within the brackets signifies letters to NOT match.
The third command returns all remaining words that don't contain any of the letters [outyase]; the -v flag prints non-matching lines.
The fourth command uses sed rather than grep to delete any remaining words that don't contain "i". You could specify an additional letter, say m, with sed "/i/!d;/m/!d".
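The same filtering can be sketched in Python, which may be easier to tweak than the shell pipeline. The constraints below mirror the example above; the inline word list is just a stand-in for the dictionary file:

```python
import re

def wordle_filter(words, pattern=r"pr.[^i].", excluded="outyase", required="i"):
    """Apply the same four constraints as the grep/sed pipeline above."""
    result = []
    for w in words:
        if not re.fullmatch(r"[a-z]{5}", w):
            continue  # first grep: exactly five lowercase letters
        if not re.fullmatch(pattern, w):
            continue  # second grep: positional pattern
        if any(c in w for c in excluded):
            continue  # grep -v: none of the excluded letters
        if not all(c in w for c in required):
            continue  # sed /i/!d: all required letters present
        result.append(w)
    return result

print(wordle_filter(["prick", "print", "prism", "pride", "crumb"]))
# prints ['prick']
```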
To write a table in Markdown, separate the header row from the body with a dashed row:

| Column A | Column B |
| -------- | -------- |
| Cell 1 | Cell 2 |
| Cell 3 | Cell 4 |
Will you be able to come up with a SQL query that returns only the new rows/items added? If so, you can refer to this doc to read the data from a Databricks Spark job using a query:
https://docs.snowflake.com/en/user-guide/spark-connector-use#using-the-connector-in-scala
val df = sparkSession.read.format(SNOWFLAKE_SOURCE_NAME)
.options(sfOptions)
.option("query", query)
.option("autopushdown", "off")
.load()
val df: DataFrame = sqlContext.read
.format(SNOWFLAKE_SOURCE_NAME)
.options(sfOptions)
.option("query", "SELECT DEPT, SUM(SALARY) AS SUM_SALARY FROM T1")
.load()
Please check that your CSS and HTML files are in the same folder, e.g.:
<link rel="stylesheet" href="styles.css">
You shouldn't put the test project inside the main project. Create a solution, then add both the main project ("Webapi") and the test project to it.
In your new folder, create the test project using the CLI; it's more stable, I'd say:
# dotnet new xunit -n "Webapi.Tests"
Your main project should be in the same folder:
WebapiProject
|__ Webapi
|__ Webapi.Tests
Note that you should create a solution and add both projects to it; also, don't forget to add a reference to the main project inside the test project.
There is a widget called ListWheelScrollView; it can help you achieve that.
You can check it out here: https://www.youtube.com/watch?v=dUhmWAz4C7Y
I am having the same issue, but when creating comps at different frame rates or aspect ratios. It works fine in the floating panel, but it falls apart when I try to make it dockable. What would be a good approach to solving my issue?
var button_margin = 20,
button_width = 160,
button_height = 35,
top_margin = 80,
panel_margin = 10;
var l_button_left = button_margin,
l_button_right = l_button_left + button_width,
m_button_left = l_button_right + button_margin,
m_button_right = m_button_left + button_width,
r_button_left = m_button_right + button_margin,
r_button_right = r_button_left + button_width,
t_button_top = top_margin,
t_button_bottom = top_margin + button_height,
m_button_top = t_button_bottom + button_margin,
m_button_bottom = m_button_top + button_height,
b_button_top = m_button_bottom + button_margin,
b_button_bottom = b_button_top + button_height;
var win_width = (panel_margin * 2) + (button_width * 3) + (button_margin * 4),
win_height = (panel_margin * 2) + (button_height * 3) +(button_margin * 3) + top_margin,
panel_right = win_width - panel_margin,
panel_bottom = win_height - panel_margin;
var win = new Window('palette', 'Floating Comp Creator'),
panel = win.add('panel', [panel_margin, panel_margin, panel_right, panel_bottom]),
divider0 = panel.add('panel', [l_button_left, panel_margin * 4, r_button_right, panel_margin * 4]),
divider1 = panel.add('panel', [l_button_right + (button_margin / 2), t_button_top - (panel_margin * 2.5), l_button_right + (button_margin / 2), b_button_bottom]),
divider2 = panel.add('panel', [m_button_right + (button_margin / 2), t_button_top - (panel_margin * 2.5), l_button_right + (button_margin / 2), b_button_bottom]),
label0 = panel.add('statictext', [m_button_left, panel_margin, m_button_right, button_height], 'Click to create the desired comp.'),
label1 = panel.add('statictext', [l_button_left + (button_width / 2) - 12, t_button_top - 22, l_button_right, t_button_top - 12], '16:9'),
label2 = panel.add('statictext', [m_button_left + (button_width / 2) - 12, t_button_top - 22, m_button_right, t_button_top - 12], '9:16'),
label3 = panel.add('statictext', [r_button_left + (button_width / 2) - 10, t_button_top - 22, r_button_right, t_button_top - 12], '1:1'),
button0 = panel.add('button', [l_button_left, t_button_top, l_button_right, t_button_bottom], '23.976 fps'),
button1 = panel.add('button', [l_button_left, m_button_top, l_button_right, m_button_bottom], '24 fps'),
button2 = panel.add('button', [l_button_left, b_button_top, l_button_right, b_button_bottom], '30 fps'),
button9 = panel.add('button', [l_button_left, m_button_top, l_button_right, m_button_bottom], '59.94 fps'),
button10 = panel.add('button', [l_button_left, b_button_top, l_button_right, b_button_bottom], '60 fps'),
button3 = panel.add('button', [m_button_left, t_button_top, m_button_right, t_button_bottom], '23.976 fps'),
button4 = panel.add('button', [m_button_left, m_button_top, m_button_right, m_button_bottom], '24 fps'),
button5 = panel.add('button', [m_button_left, b_button_top, m_button_right, b_button_bottom], '30 fps'),
button11 = panel.add('button', [l_button_left, m_button_top, l_button_right, m_button_bottom], '59.94 fps'),
button12 = panel.add('button', [l_button_left, b_button_top, l_button_right, b_button_bottom], '60 fps'),
button6 = panel.add('button', [r_button_left, t_button_top, r_button_right, t_button_bottom], '23.976 fps'),
button7 = panel.add('button', [r_button_left, m_button_top, r_button_right, m_button_bottom], '24 fps'),
button8 = panel.add('button', [r_button_left, b_button_top, r_button_right, b_button_bottom], '30 fps'),
button13 = panel.add('button', [l_button_left, m_button_top, l_button_right, m_button_bottom], '59.94 fps'),
button14 = panel.add('button', [l_button_left, b_button_top, l_button_right, b_button_bottom], '60 fps');
win.bounds = [0,0,win_width,win_height];
win.center();
win.show();
button0.onClick = function()
{
app.project.items.addComp('horizontal - 16:9 - 23.976', 1920, 1080, 1, 30, 23.976);
}
button1.onClick = function()
{
app.project.items.addComp('horizontal - 16:9 - 24', 1920, 1080, 1, 30, 24);
}
button2.onClick = function()
{
app.project.items.addComp('horizontal - 16:9 - 30', 1920, 1080, 1, 30, 30);
}
button3.onClick = function()
{
app.project.items.addComp('vertical - 9:16 - 23.976', 1080, 1920, 1, 30, 23.976);
}
button4.onClick = function()
{
app.project.items.addComp('vertical - 9:16 - 24', 1080, 1920, 1, 30, 24);
}
button5.onClick = function()
{
app.project.items.addComp('vertical - 9:16 - 30', 1080, 1920, 1, 30, 30);
}
button6.onClick = function()
{
app.project.items.addComp('square - 1:1 - 23.976', 1080, 1080, 1, 30, 23.976);
}
button7.onClick = function()
{
app.project.items.addComp('square - 1:1 - 24', 1080, 1080, 1, 30, 24);
}
button8.onClick = function()
{
app.project.items.addComp('square - 1:1 - 30', 1080, 1080, 1, 30, 30);
}
button9.onClick = function()
{
app.project.items.addComp('horizontal - 16:9 - 59.94', 1920, 1080, 1, 30, 59.94);
}
button10.onClick = function()
{
app.project.items.addComp('horizontal - 16:9 - 60', 1920, 1080, 1, 30, 60);
}
button11.onClick = function()
{
app.project.items.addComp('vertical - 9:16 - 59.94', 1080, 1920, 1, 30, 59.94);
}
button12.onClick = function()
{
app.project.items.addComp('vertical - 9:16 - 60', 1080, 1920, 1, 30, 60);
}
button13.onClick = function()
{
app.project.items.addComp('square - 1:1 - 59.94', 1080, 1080, 1, 30, 59.94);
}
button14.onClick = function()
{
app.project.items.addComp('square - 1:1 - 60', 1080, 1080, 1, 30, 60);
}
I work on Debian, and I add the "postgres" user to "myuser"'s group, which owns the folder:
usermod -a -G myuser postgres
Then I use chmod to give access to the group's users, and create the tablespace folder:
chmod 775 /home/myuser
su postgres
mkdir /home/myuser/ . . . /TableSpaceFolder
and finally run the command in "psql":
postgres=# create tablespace ts_name owner role_name location '/home/myuser/ . . . /TableSpaceFolder';
CREATE TABLESPACE
Works for me.
I am facing an issue while integrating the app: it is not able to detect the gestures once integrated.
I have tried two approaches.
One is using a TensorFlow Lite model; the model works fine when I test it, but after integration the gestures are not detected.
The second is using a Flask API backend; this approach is also not working.
I also face a problem with the CORS policy: Access to font at 'https://tadiktia.com/wp-content/uploads/elementor/google-fonts/fonts/forum-6aey4ky-vb8ew8iropi.woff2' from origin 'https://www.tadiktia.com' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. I installed the plugin Better Search Replace to change the URL. I also changed the URLs manually in Appearance → General. I also added this to .htaccess:
<IfModule mod_headers.c>
Header set Access-Control-Allow-Origin "https://www.tadiktia.com"
Header set Access-Control-Allow-Methods "GET, POST, OPTIONS"
Header set Access-Control-Allow-Headers "Content-Type"
</IfModule>
and
RewriteEngine On
RewriteCond %{HTTP_HOST} ^tadiktia.com [NC]
RewriteRule ^(.*)$ https://www.tadiktia.com/$1 [L,R=301]
And in functions.php:
function add_cors_http_header(){
header("Access-Control-Allow-Origin: *");
}
add_action('init', 'add_cors_http_header');
And still I didn't resolve it... Has anyone faced the same, or can anyone help me? Thank you very much in advance!
DbSet is an object managed inside DbContext and cannot be injected directly.
It is currently not possible to autocomplete variable names in prompts in Cursor (source).
Removing Trusted_Connection=True from your connection string helped me resolve this issue.
The problem could be a compatibility issue. Check whether your SQL-to-Python connector is the latest version.
Firefox doesn't support module ServiceWorkers yet. Here's the issue: Implement "module" service workers.
Here are two solutions:
1. Set the timeout in the web server, e.g. in the nginx config file:
proxy_connect_timeout 15m;
2. Use a queue in Node.js: when the API is called, add the task to a queue and return to the browser immediately; the task then runs in the background.
Thumbnail and LikeButton are components whose code React didn't include on its website.
Well, I ended up figuring it out soon after posting this, thanks to this thread. I ended up changing:
Command="{Binding Source={x:Reference this}, Path=BindingContext.MyCommand}}"
to
Command="{Binding Source={RelativeSource AncestorType={x:Type viewmodel:MyViewModel}}, Path=MyCommand, x:DataType=viewmodel:MyViewModel}"
NoSuchKey: The specified key does not exist. No such object: android-build/builds/aosp-master-throttled-copped-linux-aosp_cf_arm64_only_phone-userdebug/10425810/fc4f2c3f6bdafa220a856df60df471054659ab05e9bf70112375b704d2be6848/aosp_cf_arm64_only_phone-img-10425810.zip
There is no image to download anymore.
Has the repo been discarded?
Try updating your Python version.
In my case I had to run: expo export --platform web
Then you can see the files inside a directory called dist.
Upload these files to a file server.
Official documentation: how to publish websites on Netlify, Firebase, GitHub Pages, etc.
SOLVED (using a combination of 3CxEZiVlQ's and Useless's answers):
#include <cstdlib>
template <typename E>
class ABag {
int used;
E* data;
public:
E& operator[](size_t idx) { // from 3CxEZiVlQ
if (idx >= used)
{exit(-1);}
return data[idx];
}
};
template <typename Key, typename E>
class KVpair {};
template <typename Key, typename E>
class BDictionary {
public:
bool find(const Key& k, E& rtnval) const {
for (size_t cur_pos = 0; cur_pos < 0; cur_pos++) {
KVpair<Key, E> temp = (*dictionary)[cur_pos]; // from Useless
}
return false;
}
private:
ABag<KVpair<Key, E>> *dictionary;
};
int main() {
BDictionary<int, int> d;
int rv = 0;
d.find(0, rv);
}
Thanks! Really appreciate the help.
from PIL import Image
# Load the uploaded image
image_path = "/mnt/data/IMG_20230823_005640.jpg"
image = Image.open(image_path)
# Display basic info about the image
image.size, image.mode
I built a free API for this old issue.
Please update to 3.29.1, which is a hotfix for this and some other issues, as detailed here.
Does this solution work for your case?
q)t:([]date:20?2011.03.01+til 10;uid:20?17000+til 10;sym:20?50000+til 15)
q)7#t
date uid sym
----------------------
2011.03.09 17008 50001
2011.03.02 17003 50000
2011.03.01 17004 50000
2011.03.08 17001 50010
2011.03.04 17006 50001
2011.03.04 17004 50010
2011.03.07 17004 50007
q)updCols:{`$string[x],\:string[y]}
q)prevCols:{if[x=0;:`date`uid`sym]; updCols[;x] `prevdate`prevuid`prevsym}
q)f:{![x;();0b;prevCols[y]!(prev;) each prevCols y-1]}
q)7#f/[t;1+til 6]
date uid sym prevdate1 prevuid1 prevsym1 prevdate2 prevuid2 prevsym2 prevdate3 prevuid3 prevsym3 prevdate4 prevuid4 prevsym4 prevdate5 prevuid5 prevsym5 prevdate6 prevuid6 prevsym6
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2011.03.09 17008 50001
2011.03.02 17003 50000 2011.03.09 17008 50001
2011.03.01 17004 50000 2011.03.02 17003 50000 2011.03.09 17008 50001
2011.03.08 17001 50010 2011.03.01 17004 50000 2011.03.02 17003 50000 2011.03.09 17008 50001
2011.03.04 17006 50001 2011.03.08 17001 50010 2011.03.01 17004 50000 2011.03.02 17003 50000 2011.03.09 17008 50001
2011.03.04 17004 50010 2011.03.04 17006 50001 2011.03.08 17001 50010 2011.03.01 17004 50000 2011.03.02 17003 50000 2011.03.09 17008 50001
2011.03.07 17004 50007 2011.03.04 17004 50010 2011.03.04 17006 50001 2011.03.08 17001 50010 2011.03.01 17004 50000 2011.03.02 17003 50000 2011.03.09 17008 50001
The javax.print.PrintServiceLookup class relies on the underlying operating system and network configuration to discover and interact with printers. So you can either try to inject the settings from your host computer (if it's a Unix-based system), or you have to install the printer inside your container. The latter would even decouple the execution of the container from the executing host system.
Maybe this article helps you out: https://www.alecburton.co.uk/2017/printing-from-a-docker-container/
I think your examples help answer your question. As a general rule of thumb:
If the tree is binary, your recursive call usually advances the index to (i+1).
If elements can be used multiple times, j is passed unchanged, to allow for repetition and reuse of elements.
If each element is used only once and your tree is not binary, pass (j+1) to avoid reusing elements.
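The rules above can be sketched in Python (my own illustration): a binary include/exclude tree that advances to i + 1, versus a loop-based tree that passes the index unchanged so elements may repeat:

```python
def subsets_each_once(nums):
    """Binary tree: at each index, skip or take nums[i], recursing with i + 1."""
    out = []
    def walk(i, cur):
        if i == len(nums):
            out.append(cur[:])
            return
        walk(i + 1, cur)        # branch 1: skip nums[i]
        cur.append(nums[i])
        walk(i + 1, cur)        # branch 2: take nums[i]
        cur.pop()
    walk(0, [])
    return out

def combination_sum_reuse(nums, target):
    """Non-binary tree with reuse: recurse with k unchanged, not k + 1."""
    out = []
    def walk(j, cur, remaining):
        if remaining == 0:
            out.append(cur[:])
            return
        for k in range(j, len(nums)):
            if nums[k] <= remaining:
                cur.append(nums[k])
                walk(k, cur, remaining - nums[k])   # pass k -> reuse allowed
                cur.pop()
    walk(0, [], target)
    return out

print(len(subsets_each_once([1, 2, 3])))       # prints 8
print(combination_sum_reuse([2, 3, 6, 7], 7))  # prints [[2, 2, 3], [7]]
```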
Since iOS 15, the up down chevron icon of a Picker can be hidden using the .menuIndicator modifier:
.menuIndicator(.hidden)
Can you try findFirstByUsernameOrderByUsername? I think JPA expects something after By, but then you continue with OrderBy.
Phone accuracy errors, while funny, do not mask the intended phrase, simply because the speech decoding of the human brain is awesome!
But the only reason to use phonemes over voice-to-text is the limited processing power available on low-power LoRa transmissions, if you can deal with the very mechanical phoneme-based voices, which obviously anonymize the speaker.
From the image shared, it seems that you have issues with your Flutter SDK installation. Try reinstalling it.
My god, the problem was so banal it is stupid.
Apparently, in my Excel version (seemingly 2021), the criteria are separated by semicolons in the syntax.
It is working now. Thanks for your effort, though.
Providing an update regarding the "add test users" step, because the UI has changed a bit. You can find the Test Users section by navigating to APIs & Services -> OAuth consent screen -> Audience.
Here is a possible solution for those on Windows: read carefully which folders or header files are not found, locate them, and add them. That is how I solved it:
"name": "Win32",
"includePath": [
"${default}",
"C:/msys64/ucrt64/include/gtk-4.0",
"C:/msys64/ucrt64/include/pango-1.0",
"C:/msys64/ucrt64/include/fribidi",
"C:/msys64/ucrt64/include/harfbuzz",
"C:/msys64/ucrt64/include/gdk-pixbuf-2.0",
"C:/msys64/ucrt64/include/cairo",
"C:/msys64/ucrt64/include/freetype2",
"C:/msys64/ucrt64/include/libpng16",
"C:/msys64/ucrt64/include/pixman-1",
"C:/msys64/ucrt64/include/graphene-1.0",
"C:/msys64/ucrt64/include/glib-2.0",
"C:/msys64/ucrt64/lib/glib-2.0/include",
"C:/msys64/ucrt64/lib/graphene-1.0/include"
],
A simplified version of Alex's answer:
import threading
import time
lock=threading.Lock()
def thread1cb():
lock.acquire() # thread1 acquires lock first
time.sleep(1)
print("hello")
lock.release()
def thread2cb():
time.sleep(0.1)
lock.acquire() # thread2 will wait on this line until thread1 has released the lock it acquired
print("there")
lock.release()
thread1=threading.Thread(target=thread1cb)
thread2=threading.Thread(target=thread2cb)
thread1.start()
thread2.start()
thread1.join() # As long as thread1 acquires & releases the lock first, you could safely remove this line. threading.Thread(...).join() waits until the target function of the thread has returned.
thread2.join()
Output will be:
hello
there
If you comment out the lock.acquire() & lock.release() lines, it will instead print:
there
hello
Docs: https://docs.python.org/3/library/threading.html#threading.Lock.acquire
https://docs.python.org/3/library/threading.html#using-locks-conditions-and-semaphores-in-the-with-statement
Stack Overflow is like a coding discussion site:
you can ask about errors, ask questions, and share code.
Think of it as a developer Reddit/Twitter.
Using Markdown is recommended too:
//this is a codeblock
According to the release notes, NumPy 1.26.4 doesn't support Python 3.13, so yes, if you need version 1.x.y (which at least one of your other libraries seems to require), you should downgrade to Python 3.12.
I discovered I had the date: wrong. I had it as results.startDate, and when I flipped it to results.endDate, the data produced was correct.
Thanks.
Blessings, --Mark
I use psp to create a Django project and also a Docker image with the Django project that you want to develop, step by step...
This is the reference that explains the Docker image: https://psp.readthedocs.io/en/latest/simple/#dockerpodman
Run this in your shell:
$ psp
info: welcome to psp, version 0.2.0
> Name of Python project: django-scaffold
> Do you want to create a virtual environment? Yes
> Do you want to start git repository? Yes
> Select git remote provider: None
> Do you want unit test files? Yes
> Install dependencies: django django-admin startproject hello_world_django
> Select documentation generator: None
> Do you want to configure tox? No
> Do you want create common files? Yes
> Select license: MIT
> Do you want to install dependencies to publish on pypi? Yes
> Do you want to create a Dockerfile and Containerfile? Yes
info: python project `django-scaffold` created at `/tmp/django-scaffold`
$ cd django-scaffold && docker build django-scaffold
import FaceDetection from '@react-native-ml-kit/face-detection'
const result = await FaceDetection.detectFromFile(imageUri, {
performanceMode: 'fast',
landmarkMode: 'none',
classificationMode: 'none',
});
I finally got it to work by eliminating some of the whitespace, like this:
subprocess.run(["ssh", IP, "/usr/bin/gpio", "write 0 1"])
I still do not quite understand why ssh was complaining, rather than bash.
<Tooltip
active={true}
wrapperStyle={{ pointerEvents: 'auto' }}
content={content}
/>
Set the active prop to true. That's what worked for me.
You could write something like this (if you're using MockK):
mockk<HttpResponse>(relaxed = true) {
every { status } returns 200
every { rawContent } returns ByteReadChannel("body")
}
As the error mentions, your database ID is probably incorrect. Try logging the value to verify.
Often, these values are invalid because an environment variable is not set properly.
You can also use the <object> tag:
<object data="html/stuff_to_include.html">
Your browser doesn’t support the object tag.
</object>
Learn more at MDN.
<object> is currently supported in most browsers.
Your problem is partitioning the storage drive using an operating system answer file (autounattend.xml).
The solution you proposed is the diskpart tool, called by PowerShell, in one of the ISO image installation phases.
I started out using answer-file generators, such as schneegans.de (https://schneegans.de/windows/unattend-generator/), suggested by Vern_Anderson. Although it is an excellent tool, the problem is that you are tied to it; if you need a custom solution, there is no support. The ideal is to understand how the tool solves the problem.
In the case of the schneegans.de tool, partitioning happens in the "windowsPE" phase. It doesn't make sense to install and then resize; that's why the "windowsPE" phase is used for partitioning. The operating system installation comes after the "windowsPE" phase. Avoid using the "specialize" or "oobeSystem" phases for partitioning.
Now comes the first question: after a basic installation of the operating system, does your routine work? If the answer is yes, we can "try" to put the routine in the answer file. But is it possible? The tool "schneegans.de" reports that it is possible, but the "windowsPE" phase is quite restricted. The tool configures it as follows:
<settings pass="windowsPE">
<component name="Microsoft-Windows-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
<UseConfigurationSet>false</UseConfigurationSet>
<RunSynchronous>
<RunSynchronousCommand wcm:action="add">
<Order>1</Order>
<Path>cmd.exe /c ">>"X:\diskpart.txt" (echo SELECT DISK=0&echo CLEAN&echo CONVERT GPT&echo CREATE PARTITION EFI SIZE=499&echo FORMAT QUICK FS=FAT32 LABEL="System"&echo CREATE PARTITION MSR SIZE=16)"</Path>
</RunSynchronousCommand>
<RunSynchronousCommand wcm:action="add">
<Order>2</Order>
<Path>cmd.exe /c ">>"X:\diskpart.txt" (echo CREATE PARTITION PRIMARY&echo SHRINK MINIMUM=699&echo FORMAT QUICK FS=NTFS LABEL="Windows"&echo CREATE PARTITION PRIMARY&echo FORMAT QUICK FS=NTFS LABEL="Recovery")"</Path>
</RunSynchronousCommand>
<RunSynchronousCommand wcm:action="add">
<Order>3</Order>
<Path>cmd.exe /c ">>"X:\diskpart.txt" (echo SET ID="de94bba4-06d1-4d40-a16a-bfd50179d6ac"&echo GPT ATTRIBUTES=0x8000000000000001)"</Path>
</RunSynchronousCommand>
<RunSynchronousCommand wcm:action="add">
<Order>4</Order>
<Path>cmd.exe /c "diskpart.exe /s "X:\diskpart.txt" >>"X:\diskpart.log" || ( type "X:\diskpart.log" & echo diskpart encountered an error. & pause & exit /b 1 )"</Path>
</RunSynchronousCommand>
</RunSynchronous>
</component>
</settings>
Note the complexity. The XML answer file needs to be HTML-encoded, so the ">" symbol becomes "&gt;" and the "&" symbol becomes "&amp;". The tool creates the file "X:\diskpart.txt", executing the redirection line by line with the "cmd.exe" command. The last command calls the "diskpart.exe" tool, passing the created file "X:\diskpart.txt" as a parameter.
There is no need to edit the ISO image, and I do not recommend that solution. In this case, the tool placed the "code" in the answer file, with the limitation of the HTML encoding.
But if I want to inject files, how can I do it? In this case, the best tool is "Ventoy" (https://www.ventoy.net/en/plugin_injection.html). With this tool, you can pass the response file (https://www.ventoy.net/en/plugin_autoinstall.html), in addition to being able to inject files.
If you still want to use the powershell code, which calls diskpart.exe, in the "specialize" or "oobeSystem" phases, the "schneegans.de" tool has this feature. But I suggest you explore it to understand how it is done. I can tell you in advance that the procedure is quite complex.
Here is the solution that I consider most appropriate for your problem. You do not need external commands to partition; the answer file itself does it for you. For example, I create drives C: and D:. For Windows 11, I need 5 partitions: ESP, WinRE, MSR, Applications, and Users. I place the user profiles in the "D:\Usuários" folder, named in my native language (Brazilian Portuguese).
2.1) Setting up the partitions
<settings pass="windowsPE">
<component name="Microsoft-Windows-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
<DiskConfiguration>
<Disk wcm:action="add">
<DiskID>0</DiskID>
<WillWipeDisk>true</WillWipeDisk>
<CreatePartitions>
<!-- Sistema (ESP) -->
<CreatePartition wcm:action="add">
<Order>1</Order>
<Type>EFI</Type>
<Size>499</Size>
</CreatePartition>
<!-- Recuperação -->
<CreatePartition wcm:action="add">
<Order>2</Order>
<Type>Primary</Type>
<Size>699</Size>
</CreatePartition>
<!-- Reservada -->
<CreatePartition wcm:action="add">
<Order>3</Order>
<Type>MSR</Type>
<Size>99</Size>
</CreatePartition>
<!-- Sistema operacional -->
<CreatePartition wcm:action="add">
<Order>4</Order>
<Type>Primary</Type>
<Size>102400</Size>
</CreatePartition>
<!-- Dados -->
<CreatePartition wcm:action="add">
<Order>5</Order>
<Type>Primary</Type>
<Extend>true</Extend>
<!-- <Size>102400</Size> -->
</CreatePartition>
</CreatePartitions>
<ModifyPartitions>
<!-- Sistema (ESP) -->
<ModifyPartition wcm:action="add">
<Order>1</Order>
<PartitionID>1</PartitionID>
<Label>ESP</Label>
<Format>FAT32</Format>
</ModifyPartition>
<!-- Recuperação -->
<ModifyPartition wcm:action="add">
<Order>2</Order>
<PartitionID>2</PartitionID>
<Label>WINRE</Label>
<Format>NTFS</Format>
<TypeID>DE94BBA4-06D1-4D40-A16A-BFD50179D6AC</TypeID>
</ModifyPartition>
<!-- Reservada -->
<ModifyPartition wcm:action="add">
<Order>3</Order>
<PartitionID>3</PartitionID>
</ModifyPartition>
<!-- Sistema operacional -->
<ModifyPartition wcm:action="add">
<Order>4</Order>
<PartitionID>4</PartitionID>
<Label>SO</Label>
<Letter>C</Letter>
<Format>NTFS</Format>
</ModifyPartition>
<!-- Dados -->
<ModifyPartition wcm:action="add">
<Order>5</Order>
<PartitionID>5</PartitionID>
<Label>Aplic</Label>
<Letter>D</Letter>
<Format>NTFS</Format>
</ModifyPartition>
</ModifyPartitions>
</Disk>
</DiskConfiguration>
<ImageInstall>
<OSImage>
<Compact>false</Compact>
<InstallFrom>
<MetaData wcm:action="add">
<Key>/image/name</Key>
<Value>Windows 11 Pro</Value>
</MetaData>
</InstallFrom>
<InstallTo>
<DiskID>0</DiskID>
<PartitionID>4</PartitionID>
</InstallTo>
</OSImage>
</ImageInstall>
<UserData>
<ProductKey>
<Key>VK7JG-NPHTM-C97JM-9MPGT-3V66T</Key>
<WillShowUI>OnError</WillShowUI>
</ProductKey>
<AcceptEula>true</AcceptEula>
</UserData>
</component>
</settings>
2.2) Modifying the user profile folder
<settings pass="oobeSystem">
<component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
<FolderLocations>
<ProfilesDirectory>D:\Usuários</ProfilesDirectory>
</FolderLocations>
<OOBE>
<ProtectYourPC>3</ProtectYourPC>
<HideEULAPage>true</HideEULAPage>
<HideWirelessSetupInOOBE>true</HideWirelessSetupInOOBE>
<HideOnlineAccountScreens>true</HideOnlineAccountScreens>
</OOBE>
</component>
</settings>
Please note that the profiles folder has an accented character, so I needed to encode the symbol "á" as an entity ("&#225;"). I include this item just to emphasize that if you put code embedded in the XML, you will need to HTML-encode it.
Remember to use Ventoy. It is the best tool to boot an image with an answer file and injected files, without having to touch the original image!
I used an online translator. I apologize for not being fluent in the language.
I've been watching other Firebase-related repositories, and I think I have an answer (solution) for this issue.
To update the phone number that is used as a second factor, proceed as follows:
1. Log the user in again with MFA (e.g. use signInWithEmailAndPassword for email as the first factor, followed by verifyPhoneNumber for the second factor).
2. When the MFA login has successfully completed, enroll the new phone number (again using verifyPhoneNumber, but with the new phone number as the input parameter).
3. When the new phone number is successfully added, unenroll the original phone number (using MultiFactor.unenroll -> https://pub.dev/documentation/firebase_auth/latest/firebase_auth/MultiFactor/unenroll.html).
More details on steps 1 and 2 can be found at https://firebase.google.com/docs/auth/flutter/multi-factor
"ORA-29285 - File write error"
This error occurs when the hard drive to which you are exporting data is full.
A custom application that converts the standard Gregorian dates and calendar of the comprehensive ErpNext version 15 software to the solar (Jalali) calendar. This application works in all date and datetime fields, in all sections of the software, and also displays the Gregorian equivalent of the selected date below the field.
Yes, in Node.js it's possible to import CommonJS modules from ECMAScript modules. The documentation has more information.
This has been the case for as long as Node.js has supported ECMAScript module syntax (I just tested to verify that).
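A minimal sketch of both interop paths (run as an ES module; the file layout is my own illustration):

```javascript
// app.mjs - an ES module (assumed file name) importing CommonJS modules.
// Node's core 'path' module is CommonJS; a default import just works:
import path from 'node:path';

// createRequire lets an ES module require() any CommonJS file or package:
import { createRequire } from 'node:module';
const require = createRequire(import.meta.url);
const os = require('node:os'); // loaded with CommonJS semantics

console.log(path.join('a', 'b'));
console.log(typeof os.platform); // 'function'
```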
wget -qO - https://hub.unity3d.com/linux/keys/public | gpg --dearmor | sudo tee /usr/share/keyrings/Unity_Technologies_ApS.gpg > /dev/null
sudo sh -c 'echo "deb [signed-by=/usr/share/keyrings/Unity_Technologies_ApS.gpg] https://hub.unity3d.com/linux/repos/deb stable main" > /etc/apt/sources.list.d/unityhub.list'
=BYROW(J162:J164,LAMBDA(a,REGEXEXTRACT(a,"\(\d{3}\)\s\d{3}-\d{4}")))
Meanwhile some users can apply REGEXEXTRACT in Excel for the web and Office 365.
Recently I wrote a utility for serializing objects that preserves external and internal links: https://github.com/nerd220/JSONext
It is easy to use (in this case):
var serialization=toLinkedJSON({meta:meta});
var newMeta=fromLinkedJSON(serialization);
console.log(newMeta.meta[0].link==newMeta.meta[1]); //true
Apparently the following is what is needed; just setting context.res is not enough, and an empty return is not enough either.
return {
status: 200, body: {}
}
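For context, here is that return inside a complete handler (a sketch of the classic Node.js programming model; whether the returned object is used depends on an $return output binding in function.json):

```javascript
// Hypothetical HTTP-triggered Azure Function that returns the response
// object directly instead of only assigning context.res.
async function httpTrigger(context, req) {
    return {
        status: 200,
        body: {}
    };
}
// In a real function app this function would be the module's export.
```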
I think your issue here is the path of that cookie.
The protected route you are calling is /dashboard, but your cookie has the path /authentication/*, so the middleware doesn't receive it (perhaps because it's an RSC?).
I had the same issue and thought it was the HTTP-only flag, but I have another cookie that isn't HTTP-only with the path /api/*, and it doesn't appear on my frontend either. The only cookie that works in my middleware is my session_id, which has the path / and HTTP-only set to true. That being the case, you may need to give your cookie the path / for it to work properly when the routes don't match.
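As a sketch of what I mean (the cookie name and value are illustrative), setting Path=/ makes the browser send the cookie on every route, including /dashboard:

```javascript
// Build a Set-Cookie header whose Path=/ makes the cookie visible on
// /dashboard as well as /authentication/* (names are illustrative).
function sessionCookie(name, value) {
  return `${name}=${value}; Path=/; HttpOnly; SameSite=Lax`;
}

console.log(sessionCookie('session_id', 'abc123'));
// session_id=abc123; Path=/; HttpOnly; SameSite=Lax
```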
A temporary `Car(1)` is being pushed into the vector, so the copy constructor is called. Because your class lacks a move constructor, the vector can only store the element by copying it; even though it looks like you are not copying, the vector must create a copy of the temporary in order to store it.
Could you share a bit more of what you have X'd out in your image? Is it just the filename? Also, are there additional details after all of the module names and paths?
After several attempts, I have concluded the following points:
Using "path: '404-page-not-found', renderMode: RenderMode.Server, status: 404" in the app.routes.server.ts file is part of the solution.
I trigger a redirection to the 404-page-not-found page from the server.ts file rather than from the component.ts file; if it is handled at the component.ts level, the page content changes but the response status does not.
server.ts update :
const entityRegex = /\/entity\//;
const knownEntities = ['123456789-entity', '987654321-entity'];
app.use('/**', (req, res, next) => {
if (entityRegex.test(req.baseUrl)) {
let entity = req.baseUrl.substring(8);
console.log('entity:', entity);
if (knownEntities.includes(entity)) {
next();
} else {
res.redirect('/404-page-not-found');
}
} else {
next();
}
});
This checks whether the entity exists before calling next().
The status: 404 set in app.routes.server.ts ensures that a 404 status code is returned.
Pros & Cons
✅ Advantage: A proper 404 status is returned.
❌ Drawback: The server.ts file requires a knownEntitiesList, or an API call to verify existence, which can introduce additional latency. If the entity exists, this approach results in two API calls instead of one (one server-side check and another browser-side request).
An example is available in StackBlitz
Note that with this method, using curl you get:
404 status for /entity/000000000-entity.
200 status for /entity/123456789-entity.
What's up with all the people in this thread who get philosophical about what is garbage and what is collecting? This is computer science; terms have specific meanings given to them by whoever coined them. Garbage collection is not "any system where you don't have to say "free" to free the memory". No, garbage collection is a very specific strategy with the following characteristics:
Memory has to be requested to the GC for it to be tracked by the GC. The GC gets memory from the OS, keeps track of its existence, and gives you a handle to that memory.
The GC periodically checks which GC handles are still reachable.
Usually, using non-GC managed values that can access GC managed values requires notifying the GC about it, so that an exception is made.
To put it simply, the GC acts as a kind of pseudo-OS between you and the OS, one that is much smarter and can recognize at runtime when a value in memory is no longer needed and free it. This is not the same as strategies such as reference counting, where you create individual handles that point to a location in memory and have a destructor that frees the memory when a certain condition is met. That is just automated manual memory management, and you can still f* up (e.g. if you create two shared pointers that point to each other, neither of them will ever delete its value, even though both are unreachable).
Sorry for not answering the question. Many people have done that already. I'm just writing this down because there's an impressive amount of answers saying "akchually Rust is garbage collected if we count turning on the computer as "garbage collection"", which is just gonna confuse people.
The solution was page.window.width, page.window.height and page.window.resizable
Discord was able to help with that
Can you access this URL in your browser?
And when you run your code, what does the profiler report?
When a user logs out and hits the back button, browsers restore the previous page state, including localStorage values. That's why your token "reappears" despite clearing it.
During logout, record a timestamp of when the user logged out.
In your auth check, verify that any token was created after the most recent logout.
Modify the browser history using replaceState to prevent returning to authenticated states.
Consider using sessionStorage instead of localStorage.
This timestamp approach ensures that even if old tokens reappear due to browser navigation, they'll fail validation because they were created before the last logout.
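A sketch of that check in plain JavaScript (the storage key names and the helper are illustrative, not from any library):

```javascript
// A token counts as valid only if it was issued after the most recent logout.
function issuedAfterLogout(tokenIssuedAt, lastLogoutAt) {
  return tokenIssuedAt > lastLogoutAt;
}

// On logout (browser-only sketch): clear the token, record the time, and
// replace the history entry so Back cannot restore the authenticated page.
function logout() {
  localStorage.removeItem('token');
  localStorage.setItem('lastLogoutAt', String(Date.now()));
  history.replaceState(null, '', '/login');
}
```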
You can try using spaces and the newline character (\n).
Like:
print('Line_1 \nLine_2 \nLine_3 \n')
Sorry, I am not aware of any direct API reference that I could suggest here and I have never worked with Lucene.
However, I am aware that Google Desktop uses a Query API to rank and suggest the relevant search results. More information on the API can be found here.
Perhaps others could chime in and guide you.
This is no longer correct.
Install version 34.2.13 stable (30 April 2024) from https://developer.android.com/studio/emulator_archive
This version of the emulator behaves correctly on Linux.
I want to suggest another solution https://github.com/nerd220/JSONext
It allows you to easily serialize objects while preserving internal and external links, methods, and prototypes.
In this case:
var node1={data: 'some data'};
var node2={data: 'else data'};
node1.link=node2;
node2.link=node1;
var tree={node1: node1, node2: node2};
var serialize=toLinkedJSON(tree);
var newTree=fromLinkedJSON(serialize);
console.log(newTree.node1.link==newTree.node2);//true, because links are saved
You can use this tool I created
Okay, the most refined solution so far is to do the following:
.refreshable { [weak viewModel] in
viewModel?.action()
}
Most hosting providers like A2 Hosting (and others) configure their email services to allow connections only from their own servers or from trusted IP addresses (such as those within their network). When you're working remotely (on your local machine), the connection to the SMTP server might be blocked, which is likely why you're experiencing delays or failure when testing locally.
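One way to confirm it is a network restriction rather than your code is a quick connectivity probe (a sketch using Python's smtplib; the host and port are placeholders):

```python
import smtplib

def can_reach_smtp(host, port=587, timeout=10):
    """Return True if an SMTP handshake succeeds from this machine."""
    try:
        with smtplib.SMTP(host, port, timeout=timeout) as server:
            server.ehlo()
            return True
    except OSError:
        # Covers refused and timed-out connections, the typical symptom when
        # the provider only accepts SMTP connections from its own servers.
        return False

# Example: can_reach_smtp('mail.example.com')  # hypothetical host
```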
I am looking to do the same thing, but my API JSON response doesn't have that [data] attribute; how should it be handled? Thank you in advance.
Setting a file in `~/.vim/bundle/YouCompleteMe/.ycm_extra_conf.py`
import os
import ycm_core
flags = [ '-std=c++20' ]
def Settings( **kwargs ):
    return { 'flags': flags }
I strongly suggest using this library to achieve bidirectional binding for both reading and writing query params: @geckosoft/ngx-query-params
It's very useful when you are trying to read/write e.g. pagination params from URL query params, in just one line of code.
Might be useful to someone! 🙂
Open SSMS -> Connect
Server name: (LocalDb)\MSSQLLocalDB
@deype0 what kind of call do we have to make using the Graph API Explorer? I just added all permissions and sent a GET request to id name...
Just remove the following line from the schema.prisma file:
output = "../generated/prisma"
In case anyone else winds up here - they haven't updated their API documentation to reflect the actual package from NuGet. The correct code is as follows:
MailjetRequest req = new MailjetRequest();
var client = new MailjetClient(sendGridAPI, sendGridSecret);
TransactionalEmail email = new TransactionalEmail();
email.TextPart = "Text email goes in here";
email.HTMLPart = "<h1>Hello</h1>Your html email goes in here";
email.From = new SendContact("[email protected]", "That fellow");
List<SendContact> singleSend = new List<SendContact>();
singleSend.Add(new SendContact("[email protected]"));
email.To = singleSend;
var resp = await client.SendTransactionalEmailAsync(email);
Not sure why people think it's acceptable to have sample code that literally doesn't work in their up-to-date API documentation. But hey, I guess that's what the internet is for?
Yeah, I'm experiencing the same thing with HAPI-FHIR with observations. There's no delay when creating patients.
As said here, its settings are controlled by the Jitsi team, so the only way to use it without a moderator is to change the server to an available one, as the comment said, or to host your own:
You cannot control authentication on meet.jit.si because that's a deployment we maintain. You should have your own deployment and then you can choose what type of auth you want.
I want to suggest another solution https://github.com/nerd220/JSONext
It allows you to easily serialize objects while preserving internal and external links, methods, and prototypes.
In this case:
var p1 = new Person(77);
var serialize=toLinkedJSON(p1,[],['Person']);
var p2 = fromLinkedJSON(serialize);
p2.isOld(); // true, now this method works
Detaching the egg package worked for me, in case someone still wants to know:
detach("package:egg", unload = TRUE)
@timbre's answer makes many good points and is worth an up-vote. But it didn't answer exactly what I needed. So here's what I came up with.
See this article for how I came up with 6.0 corresponding to target SDK macOS 15
#if swift(>=5.0)
...swift code...
#else
...old approach...
#endif
#if canImport(SwiftUI, _version: "6.0") //hack to test for macOS 15 target sdk
...swift code...
...perhaps using #available if deployment sdk is older...
#else
...old approach...
#endif
When I try to do Item 1 (Under 3 above) where it says "In Excel, go to Data ribbon → Get Data → From Other Sources → From Microsoft Query", I don't find "From Microsoft Query". All I get is From Table/Range, From Web, From OData Feed, From ODBC, From OLEDB, From Picture, and Blank Query. The stated option → From Microsoft Query isn't there. So I still can't get the query. What am I missing here?
when I use this CSS:
ul {
    list-style: none;
    padding-left: 0;
}
ul li:before {
content: '✓';
}
in Elementor it drops the check mark to its own line; it is not aligned with the <li> content. Is there a way to fix this?
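One possible fix (a sketch, not Elementor-specific): absolutely position the marker so it stays on the same line as the item text:

```css
ul {
  list-style: none;
  padding-left: 1.2em;
}
ul li {
  position: relative;
}
ul li::before {
  content: '\2713';
  position: absolute;
  left: -1.2em; /* keeps the check mark beside, not above, the text */
}
```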
It would be easier to create similar shapes in Figma or Adobe Illustrator and then import the SVGs into the project. Making icons in code is impractical; a visual editor will be more convenient in any case.
where a strict MVC separation is not that much of a concern.
To emphasise that - when building an MVC app, the QListWidget cannot have the model set - the setModel() method is private. You have to use QListView to get all the items to sync up.
I have developed a free VS Code (and Cursor) extension that supports both Windows and MacOS systems. See https://github.com/hanlulong/stata-mcp. This extension provides Stata integration for Visual Studio Code and Cursor IDE using the Model Context Protocol (MCP). It allows you to:
Run Stata commands directly from VS Code or Cursor
Execute selections or entire .do files
View Stata output in the editor in real-time
Get AI assistant integration through the MCP protocol
Experience enhanced AI coding in Cursor IDE with Stata context
Feedback is welcome.
I know this is a very old question, but maybe this will help someone... I had to insert columns into PostgreSQL tables at specific positions many times, so I wrote a Python script for it. It follows the column rotation approach in this blog post, which was also referred to earlier.
Limitations include:
Foreign keys, constraints, indexes etc. will need to be recreated if they apply to columns behind the new one
It is tested with psycopg 3.2.5 and Python 3.13. Other versions may well work, but you would have to try it out
DB connection and script parameters are hardcoded (but then, this is not something to be widely disseminated)
It does handle all data types including arrays and default values. The parameters are hopefully self-explanatory. The new column will be at position new_column_pos after the script ran.
from psycopg import sql, connect


def insert_pg_column(conn, table_name, new_column_name, new_column_type, new_column_position):
    cur = conn.cursor()
    # Get column names and types from the table
    cur.execute(
        sql.SQL(
            "SELECT column_name, data_type, udt_name, character_maximum_length, column_default "
            "FROM information_schema.columns WHERE table_name = %s ORDER BY ordinal_position"
        ),
        [table_name],
    )
    columns = cur.fetchall()
    print(f"Retrieved definitions for {len(columns)} columns")
    # Remove from the list all columns which remain unchanged
    columns = columns[new_column_position - 1 :]
    column_names = [col[0] for col in columns]
    # Add the new column to the table (at the end)
    cur.execute(
        sql.SQL("ALTER TABLE {} ADD COLUMN {} {}").format(
            sql.Identifier(table_name), sql.Identifier(new_column_name), sql.SQL(new_column_type)
        )
    )
    print(f"Added new column '{new_column_name}' to table '{table_name}'")
    # Create temporary columns to hold the data temporarily
    temp_columns = {}
    for col_name, col_type, udt_name, length, default in columns:
        temp_col_name = f"{col_name}_temp"
        temp_columns[col_name] = temp_col_name
        # Handle array types
        if col_type == "ARRAY":
            if udt_name.startswith("_"):
                data_type = f"{udt_name[1:]}[]"  # Remove the leading underscore
            else:
                data_type = f"{udt_name}[]"  # Not sure this ever happens?
        else:
            data_type = col_type
            if length is not None:  # For character types
                data_type += f"({length})"
        cur.execute(
            sql.SQL("ALTER TABLE {} ADD COLUMN {} {} {}").format(
                sql.Identifier(table_name),
                sql.Identifier(temp_col_name),
                sql.SQL(data_type),
                sql.SQL("DEFAULT {}").format(sql.SQL(default)) if default is not None else sql.SQL(""),
            )
        )
        print(f"Added temporary column '{temp_col_name}'" + (f" with default '{default}'" if default else ""))
    # Update the temporary columns to hold the data in the desired order
    for col_name in column_names:
        cur.execute(
            sql.SQL("UPDATE {} SET {} = {}").format(
                sql.Identifier(table_name), sql.Identifier(temp_columns[col_name]), sql.Identifier(col_name)
            )
        )
        print(f"Copied data from column '{col_name}' to '{temp_columns[col_name]}'")
    # Drop the original columns
    for col_name in column_names:
        cur.execute(
            sql.SQL("ALTER TABLE {} DROP COLUMN {}").format(sql.Identifier(table_name), sql.Identifier(col_name))
        )
        print(f"Dropped original column '{col_name}'")
    # Rename the temporary columns to the original column names
    for col_name in column_names:
        cur.execute(
            sql.SQL("ALTER TABLE {} RENAME COLUMN {} TO {}").format(
                sql.Identifier(table_name), sql.Identifier(temp_columns[col_name]), sql.Identifier(col_name)
            )
        )
        print(f"Renamed '{temp_columns[col_name]}' to '{col_name}'")
    conn.commit()
    cur.close()


if __name__ == "__main__":
    # Database connection parameters
    HOST = "your_host"
    DATABASE = "your_dbname"
    USER = "your_user"  # Needs to have sufficient privileges to alter the table!
    PASSWORD = "your_password"
    # Parameters for adding a new column (EXAMPLE; REPLACE WITH YOUR OWN VALUES!)
    table_name = "users"
    new_column_name = "user_uuid"
    new_column_type = "uuid"
    new_column_pos = 3  # Position is a 1-based index
    connection = connect(f"dbname={DATABASE} user={USER} password={PASSWORD} host={HOST}")
    try:
        insert_pg_column(connection, table_name, new_column_name, new_column_type, new_column_pos)
        print(f"Successfully added column '{new_column_name}' to table '{table_name}' at position {new_column_pos}.")
    except Exception as e:
        print(f"Error: {e}")
        connection.rollback()
    finally:
        connection.close()
Construct a less-than cumulative frequency table for this data:
Marks (x)    Frequency (f)    More-than cumulative (f)
3-7          21               29+21=50
8-12         22               7+22=29
13-17        4                3+4=7
18-22        2                1+2=3
23-27        1                1
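Assuming the frequencies read 21, 22, 4, 2, 1 for the classes 3-7 through 23-27, the less-than cumulative frequencies are just running totals; a short Python sketch:

```python
# Less-than cumulative frequency: a running total of the class frequencies.
classes = ["3-7", "8-12", "13-17", "18-22", "23-27"]
freqs = [21, 22, 4, 2, 1]

running = 0
less_than = []
for f in freqs:
    running += f
    less_than.append(running)

for c, f, cf in zip(classes, freqs, less_than):
    print(f"{c:>6}  f={f:>2}  less-than cf={cf}")
# less_than is [21, 43, 47, 49, 50]; the final value equals the total N=50.
```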