I managed to solve this. I had previously generated the registry entries using this command:
regasm TheApplication.dll /regfile:TheApplication.reg
Then I had imported TheApplication.reg into the registry using regedit.
Referring to the regasm documentation here: https://learn.microsoft.com/en-us/dotnet/framework/tools/regasm-exe-assembly-registration-tool
I discovered this: "The /regfile option only emits registry entries for managed classes. This option does not emit TypeLibID or InterfaceID entries."
To get it working, I registered the DLL with the following command instead:
regasm TheApplication.dll /tlb:TheApplication.tlb /codebase
Try using this:
Modifier.scale(scaleX = -1f, scaleY = 1f),
It worked for me.
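For context, here is a minimal sketch of how that modifier might be used to mirror an Image horizontally (the drawable resource and composable name are made up for illustration):
import androidx.compose.foundation.Image
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.draw.scale
import androidx.compose.ui.res.painterResource

@Composable
fun MirroredLogo() {
    // scaleX = -1f flips the content horizontally, scaleY = 1f leaves the vertical axis unchanged
    Image(
        painter = painterResource(R.drawable.logo), // hypothetical drawable
        contentDescription = "Mirrored logo",
        modifier = Modifier.scale(scaleX = -1f, scaleY = 1f)
    )
}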
There is an open issue regarding these error logs. However, no fix yet.
Simple solution: the command below worked for me.
git for-each-ref --sort=-committerdate refs/heads/ --format='%(refname:short) - %(committerdate:short)'
I think I recall that for updating only some fields in PrestaShop the method must be PATCH, not PUT. If I remember correctly, the PATCH method must also be enabled for each resource. Otherwise, if PUT is used, I believe you must include all fields every time you update.
It looks like a problem with library versions. Which version of Spring do you use on the server and on the client?
First of all, make sure you are using Medusa v2 and not v1. Then follow the docs to add the relationship using the link module.
To extend functionality, make use of subscribers, overriding APIs, hooks, etc.
Most of this is covered in the website docs.
Screenshot of the result from the "ls" command. The second line was made after the db3 file was copied back from the Windows PC. It is easy to see that the line break is different. How can this be?
I’m facing the same issue while using MS Graph to create a webinar (v1.0/solutions/virtualEvents/webinars), and I want to know if you managed to fix the problem.
Here’s my current setup:
Request Body:
{
"displayName": "The Impact of Tech on Our Lives",
"description": {
"contentType": "text",
"content": "Discusses how technology has changed the way we communicate."
},
"startDateTime": {
"dateTime": "2024-03-30T10:00:00",
"timeZone": "UTC"
},
"endDateTime": {
"dateTime": "2024-03-30T17:00:00",
"timeZone": "UTC"
},
"audience": "everyone"
}
Headers:
Error Response:
{
"error": {
"code": "BadRequest",
"message": "Bad Request",
"innerError": {
"date": "2025-01-28T10:31:56",
"request-id": "...",
"client-request-id": "..."
}
}
}
Did you manage to resolve this issue using the virtual event webinar Graph APIs, or find another workaround?
Any advice would be appreciated!
If you are using the new version of Expo, SDK 52, it was updated to automatically use expo-router as the default routing system. Therefore, it automatically added the property "main": "expo-router/entry" to your package.json. So, to avoid falling back to expo-router, you must remove this line from your package.json, and then your react-navigation setup should work with no problem!
You can just add the environment variable to the kernel.json file of the used kernel (virtual env):
,"env": {
"PYTHONPATH": "/path/to/folder:$PYTHONPATH"
}
The kernel.json could be found somewhere in the .local folder (~/.local/share/jupyter/kernels/your_venv/kernel.json)
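For reference, a complete kernel.json with the env block added might look like this (the interpreter path and display name are placeholders for your own virtual env):
{
  "argv": ["/path/to/venv/bin/python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "your_venv",
  "language": "python",
  "env": {
    "PYTHONPATH": "/path/to/folder:$PYTHONPATH"
  }
}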
If somebody is dealing with a dataset which includes a header row:
{ head -n 1 data.csv && tail -n +2 data.csv | sort -k1 -n -t,; } > result.csv
This bash code leaves the header row in place and sorts the remaining rows numerically (ascending) by the first column.
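If you want the data rows in descending order instead, the same structure works with sort's -r flag (same assumed file layout, comma-separated with one header line):
{ head -n 1 data.csv && tail -n +2 data.csv | sort -k1 -rn -t,; } > result.csv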
I'm using an Ubuntu 22.04 machine and currently facing the same issue.
Note: the Docker container I'm attaching to via VS Code runs Ubuntu 24.04.
Any solution for this?
PS: Sorry for writing my query in the answer section; I'm unable to add a comment currently.
Did you check whether the UInt64(total - id) * 1_000_000_000 statement overflows?
If so, you could try something like this:
let (value, overflowOccurred) = interval.multipliedReportingOverflow(by: 1_000_000_000)
and then check overflowOccurred before using value.
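A minimal sketch of that check (assuming total and id are the integer values from your code and that this runs in an async throwing context):
let interval = UInt64(total - id)
let (value, overflowOccurred) = interval.multipliedReportingOverflow(by: 1_000_000_000)
if overflowOccurred {
    // The product does not fit in UInt64, so don't pass it to Task.sleep
    print("interval * 1_000_000_000 overflows UInt64")
} else {
    try await Task.sleep(nanoseconds: value)
}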
I've tried try await Task.sleep(nanoseconds: Int64.max) and it completes immediately. At the same time, the documentation doesn't state which maximum value can be passed to this method.
Unfortunately, you can't control color inversion in dark mode when using Outlook. Let's delve deeper into this issue:
https://www.litmus.com/blog/the-ultimate-guide-to-dark-mode-for-email-marketers
Use:
int sqlite3_column_count(sqlite3_stmt *pStmt);        // to get the number of columns
const char *sqlite3_column_name(sqlite3_stmt*, int N); // to get the name of column N
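As a rough sketch of combining the two (assuming stmt is a prepared statement you already have):
#include <stdio.h>
#include <sqlite3.h>

/* Print the name of every result column of a prepared statement. */
static void print_column_names(sqlite3_stmt *stmt) {
    int count = sqlite3_column_count(stmt);                       /* number of result columns */
    for (int i = 0; i < count; i++) {
        printf("column %d: %s\n", i, sqlite3_column_name(stmt, i)); /* name of column i */
    }
}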
<img loading='lazy' src={require('../../assets/images/logo.svg').default} alt="logo" />
You can check the documentation: https://github.com/ng-select/ng-select?tab=readme-ov-file#custom-styles
Thanks, it works. I was lost, but you saved my life.
Hello Daniel, please help!
A Separator appears vertically within a StatusBar. If you are able to change your parent control from a <StackPanel Orientation="Horizontal"/> to a <StatusBar/>, then the separators will be displayed vertically with no custom style needed.
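For illustration, a minimal sketch of that layout (the item contents are just placeholders):
<StatusBar>
    <StatusBarItem Content="Ready"/>
    <Separator/>  <!-- rendered as a vertical divider inside a StatusBar -->
    <StatusBarItem Content="Line 1, Col 1"/>
</StatusBar>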
For PostgreSQL:
SELECT count(distinct array[DocumentId, DocumentSessionId]) FROM DocumentOutputItems;
Finally, thanks to @KenS, I found a solution to my problem.
I'm not able to get the latest master branch built for Alpine Linux, and it would take more time to find a solution than I have.
So I took the latest release (10.04 for now), which I am able to build for Alpine, and patched in the following commit: https://cgit.ghostscript.com/cgi-bin/cgit.cgi/ghostpdl.git/commit/?id=5d9b000c70f66f5be7bfaed1f16ef4e50d820dc6
The result is a fully valid PDF/A-3 document, and Adobe now sees the attachment as well.
So until 10.05 is released in March, this is fine for us.
Thank you ASH! Thank you, Thank you. You're the only one I've seen who parameterized it this way, and this is the ONLY way that has worked for me. And just to mention it, the Linking also works too just by changing LinkSource to True. :)
I've found the problem!
The problem was an incorrect package name in the sealed class file!
Like this:
Once I fixed it to the correct package, the problem was gone.
I'm migrating the project to the LTS version right now and got a Gradle error; I don't know what to do. Do you have any thoughts? CommandInvokationFailure: Gradle build failed. C:\Program Files\Unity\Hub\Editor\6000.0.34f1\Editor\Data\PlaybackEngines\AndroidPlayer\OpenJDK\bin\java.exe -classpath "C:\Program Files\Unity\Hub\Editor\6000.0.34f1\Editor\Data\PlaybackEngines\AndroidPlayer\Tools\gradle\lib\gradle-launcher-8.4.jar" org.gradle.launcher.GradleMain "-Dorg.gradle.jvmargs=-Xmx4096m" "bundleRelease"
Follow the steps:
Looks like there's no option for having two month pickers anymore since v1.0.0: https://github.com/hypeserver/react-date-range/blob/master/src/components/Calendar/index.js#L204
You can lock the old version in your package.json if you want to. It'll look like this:
"react-date-range": "0.9.4"
The best way to do that is via the package manager:
npm uninstall react-date-range
npm install react-date-range@0.9.4
The same goes for yarn remove / yarn add, and for other package managers, etc.
It's 8 years old and won't be updated, but if it doesn't contain serious bugs and works for you, then why not?
import os
os.environ["OPENAI_API_KEY"] = "your-api-key-here"
This works for me; the other approaches just kept throwing the same error over and over.
A third option on Linux (or on MacOS Intel if you install bindfs) in addition to the two in @Daniel t.'s answer is something I found in moby/moby#7198: use a host-side bind mount to rewrite the UID and GID, and then mount the target of that mount into your container.
For example:
sudo bindfs -u UID -g GID app app_chowned
docker run -d \
-it \
--name devtest \
-v "$(pwd)"/app_chowned:/app \
nginx:latest
All credit to jjrv on GitHub, the author of that comment.
As a third-party developer we received a mail from [email protected] saying that our client (the bank issuing the card) should confirm that we work for them. When our client contact sent a mail to [email protected] for that purpose, they got a notification that this mail account is used for form submissions only.
So the question is: which email address should the confirmation of our developer relationship be sent to?
I just had the same issue and the solution is to use Auth Type "API Key"
Key: "Authorization"
Value: "Token {{TOKEN}}"
Add to: Header
Go to C/C++ configurations and navigate to the include paths. Type in the path to the Qt include directory, e.g. C:\Qt\6.8.1\mingw_64\include\**.
Do not forget to add the two asterisks at the end so that it also searches subdirectories.
I have the same problem. Did you manage to solve it?
The problem is related to antivirus software and Microsoft Defender. Add the Android Studio related folders to the exclusion list of MS Defender. Check the build after adding every folder, and do each one separately. Here is the possible list:
Make sure to replace with your actual Windows username and adjust drive letters as necessary for custom setups. Adding these folders will help avoid Defender interference during builds and improve Android Studio performance.
# Sets are unordered, so when using the pop() method, you do not know which item gets removed. Kind of random generation :)
Boys_set = set(Boy)
Girls_set = set(Girl)

print(f'Before : {len(Boys_set)}')

def boyname():
    result = Boys_set.pop()
    print(result)
    # print(f"Remaining Boys: {Boys_set}")
    if result not in Boys_set:
        print('Success!')

boyname()
print(f'After : {len(Boys_set)}')

# To pop all Boys randomly
length = len(Boys_set)
for i in range(0, length):
    boyname()
print(f'Remaining Boys: {len(Boys_set)}')
This should automate the calculation of some widget dimension details:
from tkinter import *
import platform
root = Tk()
root.update()
screen_x = root.winfo_screenwidth()
screen_y = root.winfo_screenheight()
edge_x = root.winfo_rootx() - root.winfo_x()
edge_y = edge_x
title_y = root.winfo_rooty() - root.winfo_y() - edge_y
if platform.system() in ['Windows', 'Darwin']:
    root.state('zoomed')
elif platform.system() in ['Linux', 'freebsd7']:
    root.attributes('-zoomed', True)
root.update()
zoomed_screen_y = root.winfo_height()
task_bar_y = screen_y - zoomed_screen_y - title_y
print(task_bar_y)
root.destroy()
root = Tk()
root.geometry('{}x{}+0+0'.format(screen_x - 2 * edge_x, screen_y - title_y - 2 * edge_y - task_bar_y))
root.update()
Setting up an SSL certificate with Nginx for an R-Shiny app sounds great! It ensures secure connections and adds professionalism to your custom domain.
Firstly, the insert that you have used seems to be incorrect. Correct query:
insert into test values ('{ "description":"employee", "criteria": { "employee_id": { "in": [10137,12137,19137] } } }');
Next, the query to be used to retrieve the 2 ids:
select * from test where data->'criteria'->'employee_id'->'in' ?& array[10137,12137];
Try the above and check if it works.
Struggling with M1 MacBook Pro Android Studio Installation Issue? Discover solutions for the SDK not downloading and get your development back on track.
Solutions:
Get a glimpse of the final version without installing it.
Get Java for your Mac and install it.
Get the JDK for Java for your Mac and install it.
Get the JRE for Java for your Mac and install it.
Create a folder called "/Android/sdk" under Users/USER_NAME/Library.
You may now install Android Studio.
To know more details, visit Macbook Repair Dubai
Instead of
var oRBGroup = this.getView().byId("rbg1");
var oButtonSelectedIndex = oRBGroup.getSelectedButtonIndex();
write
var oRBGroup = this.getView().byId("rbg1");
var oButtonSelectedIndex = oRBGroup.getSelectedIndex();
You don't need a filter for the event name, just use request.EventName = "SUSPEND_USER"
Ref: https://developers.google.com/admin-sdk/reports/v1/appendix/activity/admin-event-names
According to this issue, interacting with database views through the Notion API was not supported back in 2021. And I don't see any news on that subject since then, either in the documentation or in the GitHub SDK.
I don't think it's possible to update terms of service in Google Play. The above answers confuse "terms of service" and "privacy policy" which is not the same thing.
It's possible in Apple developer console though.
Yes, eCommerce platforms are absolutely worth it for large-scale systems. Platforms like NextShopz are designed to handle the complexities of large-scale operations while simplifying management and enhancing efficiency.
Here’s why eCommerce platforms are invaluable for large systems:
For large-scale systems, eCommerce platforms like NextShopz provide the stability, flexibility, and tools needed to thrive in the competitive online marketplace.
Asking a question in the answer, as I cannot comment! Let's say user1 and user2 are members of conv1, and user1 deletes conv1, so we update the deleted_by field with user1's id. Now, if user2 messages conv1, do we have to make the deleted_by field empty, since user2 started conv1 again? What do we have to do here?
I had a problem surrounding a similar issue: I was updating a project running on Linux Docker from 6 to 8 and ran into the OpenSSL issue. I needed to modify my image to work with lower TLS versions:
RUN sed -i '/\[openssl_init\]/a ssl_conf = ssl_configuration' /etc/ssl/openssl.cnf && \
echo "\n[ssl_configuration]" >> /etc/ssl/openssl.cnf && \
echo "system_default = tls_system_default" >> /etc/ssl/openssl.cnf && \
echo "\n[tls_system_default]" >> /etc/ssl/openssl.cnf && \
echo "MinProtocol = TLSv1" >> /etc/ssl/openssl.cnf && \
echo "CipherString = DEFAULT@SECLEVEL=0" >> /etc/ssl/openssl.cnf
maybe this helps you in any way
As documented here, I can use 'foreignKey' instead of ->setForeignKey().
If I use ->setForeignKey() it will invoke the framework's conventions, which means that if I want to associate with the contact archives table, the belongsTo must be declared like:
$this->belongsTo('ContactArchives', [
'setForeignKey' => 'contact_archive_id'
]);
But I only have contact_id in the contact archives table which means I must bypass the conventions and use the below to access the active and archives contacts table:
Active Contacts Table
$this->belongsTo('Contacts', [
'setForeignKey' => 'contact_id'
]);
Archive Contacts Table
$this->belongsTo('ContactArchives', [
'foreignKey' => 'contact_id'
]);
You are running into a Python environment conflict because LibreOffice Calc uses its own embedded Python interpreter, which can be tricky to work around when you also want to use a global Python interpreter. Attempt using IPC (inter-process communication): you can separate the two parts entirely by using IPC methods to communicate between the LibreOffice Python and your global Python.
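As a minimal sketch of that idea (the path to the system interpreter and the worker script are assumptions, not something LibreOffice provides), the macro side could hand work to the global interpreter over a subprocess pipe and read the result back as JSON:
import json
import subprocess

SYSTEM_PYTHON = "/usr/bin/python3"        # assumed path to the global interpreter
WORKER_SCRIPT = "/path/to/worker.py"      # hypothetical script executed by the global Python

def run_in_global_python(payload):
    """Send a dict to the worker script and return its JSON reply."""
    proc = subprocess.run(
        [SYSTEM_PYTHON, WORKER_SCRIPT],
        input=json.dumps(payload),
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(proc.stdout)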
SORTED MULTIMAP WITH PUBLIC NON-PARAMETRIC MAPPINGS
The multimap abstract data type, a container of key-value associations where duplicate keys are allowed, is defined with the following interface:
public interface SortedMultimap<K extends Comparable<K>, V> {
    /** Returns a value to which the specified key is mapped.
     * @param key the key whose associated value is to be returned.
     * @return a value to which the specified key is mapped, or null if this sorted multimap contains no mapping for the key.
     */
    V find(K key);
    /** Returns true if this sorted multimap contains no key-value mappings.
     * @return true if this sorted multimap contains no key-value mappings.
     */
    boolean isEmpty();
    /** Associates the specified value with the specified key in this sorted multimap.
     * @param key the key with which the specified value is to be associated.
     * @param value the value to be associated with the specified key.
     * @throws java.lang.IllegalArgumentException if the specified key or value is null.
     */
    void insert(K key, V value);
    /** Removes a mapping for a key from this sorted multimap if it is present.
     * @param key the key whose mapping is to be removed from this sorted multimap.
     * @return the previous value associated with key, or null if there was no mapping for key.
     */
    V remove(K key);
    /** Returns the number of key-value mappings in this sorted multimap.
     * @return the number of key-value mappings in this sorted multimap.
     */
    int size();
    /** Returns a sorted array view of the keys contained in this sorted multimap.
     * @return an array view of the sorted keys contained in this sorted multimap where keys are sorted in ascending order according to their natural order.
     */
    Object[] sortedKeys();
}
where K and V are the parametric data types for the key and value. The class to be used for the mappings is the following:
public class Entry {
    // instance variables
    private Object key, value;
    /** Constructs a new mapping with the specified key and value.
     * @param k the specified key of this mapping.
     * @param v the specified value of this mapping.
     */
    public Entry(Object k, Object v) { setKey(k); setValue(v); }
    /** Returns the key of this mapping.
     * @return the key of this mapping.
     */
    public Object getKey() { return key; }
    /** Returns the value of this mapping.
     * @return the value of this mapping.
     */
    public Object getValue() { return value; }
    /** Sets the key of this mapping with the specified key.
     * @param k the specified key.
     */
    public void setKey(Object k) { key = k; }
To compile an ursina app, you can use auto-py-to-exe and select all the modules that you are using.
I just changed my VM device and switched to Android 14 (UpsideDownCake), API level 34, and now it's working.
Unfortunately, you can't control color inversion at all in dark mode when using Gmail. Let's delve deeper into this issue:
https://www.litmus.com/blog/the-ultimate-guide-to-dark-mode-for-email-marketers
Opacity level 0 means there is no component available to accept the click event. You can try the value 0.01f instead of 0. This value will make the window almost invisible while still serving your purpose.
this.setOpacity(0.01f);
The value 0.001 will be too small and will not work.
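A rough sketch of the idea in Swing (assuming an undecorated window such as a JWindow, and that the platform supports window translucency):
import javax.swing.JWindow;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;

public class AlmostInvisibleWindow {
    public static void main(String[] args) {
        JWindow window = new JWindow();   // undecorated by nature, so setOpacity is allowed
        window.setSize(300, 300);
        window.setOpacity(0.01f);         // nearly invisible, but still receives clicks
        window.addMouseListener(new MouseAdapter() {
            @Override
            public void mouseClicked(MouseEvent e) {
                System.out.println("Clicked at " + e.getPoint());
            }
        });
        window.setVisible(true);
    }
}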
You misunderstood the answer from the first link; that answer suggests, as a solution, replacing the model in the Meta class as in the code below. Also make sure you have set AUTH_USER_MODEL = 'custom_user_app.CustomUserModel' in the settings file.
# You need to replace the direct import of your model with the one shown below
# from .models import User
from django.contrib.auth import get_user_model
User = get_user_model()
class RegisterSerializer(serializers.ModelSerializer):
    password = serializers.CharField(
        style={'input_type': 'password'},
        write_only=True,
    )

    class Meta:
        model = User
        fields = ['email', 'password', 'password2']
...
Also try the same trick in your tests. This should fix your initial error, but there are still problems in your code: for example, in the create serializer method, replace the pasword keyword with password. I hope it is now clearer what you should try to do; check it yourself and let me know if your problem has been fixed.
Thanks, with your answer I resolved the problem; now it's working.
I think there's a session_start(); missing in your code.
Otherwise, you should try to do a print_r() to see if you missed something there (or a var_dump()).
Easily:
While loading the page, add a JS request to the server side, which will give you the client's public IP.
On the server side, in your backend script language, execute the CLI command ping client.ip.add.ress -c 1.
Then parse the output according to your OS and backend scripting language, and return the msec result to your web page.
On the web page, using your JS code, print a readable result to the user.
This can be repeated again and again, because the network can change over time, and it would be quite nice to constantly refresh the value. And don't forget to catch any exceptions, like ping being forbidden, a firewall, or ping packet loss.
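As a very rough sketch of the server side (Node.js/Express purely as an example; the route name and the ping parsing are assumptions, and shelling out to ping like this needs the usual input validation):
const express = require('express');
const { execFile } = require('child_process');

const app = express();

// Returns the round-trip time (ms) from the server to the caller's public IP.
app.get('/api/ping-me', (req, res) => {
    const ip = req.ip.replace('::ffff:', '');               // strip IPv4-mapped prefix if present
    execFile('ping', ['-c', '1', ip], (err, stdout) => {
        if (err) return res.status(500).json({ error: 'ping failed or forbidden' });
        const match = stdout.match(/time=([\d.]+) ms/);      // parse "time=12.3 ms" from ping output
        res.json({ ms: match ? parseFloat(match[1]) : null });
    });
});

app.listen(3000);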
I hope you are also doing well. Cheers!
This example is wrong. Inside a procedural block like an always block you can't use the non-blocking assignment operator, i.e. <=. You always have to use the blocking assignment operator, i.e. =.
Assalamu alaykum!
I have implementation in Kotlin that worked perfectly for my case:
import androidx.compose.ui.geometry.Offset
import kotlin.math.abs
import kotlin.math.sqrt
object GetBoundaryBorder {
fun getPolygonsHull(polygons: List<List<Offset>>, n: Int = 4): List<Offset> {
val allPoints = polygons.flatten()
val initialHull = findConvexHull(allPoints)
return findPolygonHull(initialHull, n)
}
fun getPolygonHull(polygons: List<Offset>, n: Int = 4): List<Offset> {
val initialHull = findConvexHull(polygons)
return findPolygonHull(initialHull, n)
}
private fun findConvexHull(points: List<Offset>): List<Offset> {
if (points.size < 3) return points
val start = points.minWith(compareBy({ it.y }, { it.x }))
val sortedPoints = points.filter { it != start }.sortedWith(compareBy(
{ point ->
val dx = point.x - start.x
val dy = point.y - start.y
Math.atan2(dy.toDouble(), dx.toDouble())
},
{ point ->
val dx = point.x - start.x
val dy = point.y - start.y
dx * dx + dy * dy
}
))
val stack = mutableListOf(start)
for (point in sortedPoints) {
while (stack.size >= 2 && !isCounterClockwise(
stack[stack.size - 2],
stack[stack.size - 1],
point
)
) {
stack.removeAt(stack.size - 1)
}
stack.add(point)
}
return stack
}
private fun isCounterClockwise(p1: Offset, p2: Offset, p3: Offset): Boolean {
val crossProduct = (p2.x - p1.x) * (p3.y - p1.y) - (p2.y - p1.y) * (p3.x - p1.x)
return crossProduct > 0
}
private fun findPolygonHull(hullOffsets: List<Offset>, numberOfSides: Int = 4): List<Offset> {
// Handle special cases
when (hullOffsets.size) {
0, 1 -> return hullOffsets
2 -> return expandLineToQuad(hullOffsets[0], hullOffsets[1])
3 -> return expandTriangleToQuad(hullOffsets)
}
var currentHull = hullOffsets
while (currentHull.size > numberOfSides) {
var bestCandidate: Pair<List<Offset>, Double>? = null
for (edgeIdx1 in currentHull.indices) {
val edgeIdx2 = (edgeIdx1 + 1) % currentHull.size
val adjIdx1 = (edgeIdx1 - 1 + currentHull.size) % currentHull.size
val adjIdx2 = (edgeIdx1 + 2) % currentHull.size
val edgePt1 = currentHull[edgeIdx1]
val edgePt2 = currentHull[edgeIdx2]
val adjPt1 = currentHull[adjIdx1]
val adjPt2 = currentHull[adjIdx2]
val intersection = lineIntersectionBorder(adjPt1, edgePt1, edgePt2, adjPt2) ?: continue
val area = triangleAreaBorder(edgePt1, intersection, edgePt2)
if (bestCandidate != null && bestCandidate.second < area) continue
val betterHull = currentHull.toMutableList()
betterHull[edgeIdx1] = intersection
betterHull.removeAt(edgeIdx2)
bestCandidate = Pair(betterHull, area)
}
bestCandidate?.let {
currentHull = it.first.toMutableList()
} ?: break // If we can't find a valid candidate, break instead of throwing exception
}
return currentHull
}
private fun expandTriangleToQuad(triangle: List<Offset>): List<Offset> {
// Find the longest edge
val edges = listOf(
Pair(0, 1),
Pair(1, 2),
Pair(2, 0)
)
// Calculate edge lengths and find longest
val edgeLengths = edges.map { (i, j) ->
Triple(i, j, distance(triangle[i], triangle[j]))
}
val longestEdge = edgeLengths.maxBy { it.third }
// Find the point not on the longest edge
val oppositePointIndex = (0..2).first { it != longestEdge.first && it != longestEdge.second }
val oppositePoint = triangle[oppositePointIndex]
// Get points of the longest edge
val edgePoint1 = triangle[longestEdge.first]
val edgePoint2 = triangle[longestEdge.second]
// Calculate vector from longest edge to opposite point
val edgeVector = Offset(
edgePoint2.x - edgePoint1.x,
edgePoint2.y - edgePoint1.y
)
// Calculate midpoint of longest edge
val midpoint = Offset(
(edgePoint1.x + edgePoint2.x) / 2,
(edgePoint1.y + edgePoint2.y) / 2
)
// Calculate vector from midpoint to opposite point
val toOppositeVector = Offset(
oppositePoint.x - midpoint.x,
oppositePoint.y - midpoint.y
)
// Calculate the length of this vector
val oppositeLength = sqrt(toOppositeVector.x * toOppositeVector.x + toOppositeVector.y * toOppositeVector.y)
// Create a point on the opposite side with the same distance
val fourthPoint = Offset(
midpoint.x - (toOppositeVector.x / oppositeLength) * oppositeLength * 1.5f,
midpoint.y - (toOppositeVector.y / oppositeLength) * oppositeLength * 1.5f
)
// Create result in correct order
return sortPointsClockwise(
listOf(
edgePoint1,
edgePoint2,
oppositePoint,
fourthPoint
)
)
}
private fun distance(p1: Offset, p2: Offset): Float {
val dx = p2.x - p1.x
val dy = p2.y - p1.y
return sqrt(dx * dx + dy * dy)
}
private fun sortPointsClockwise(points: List<Offset>): List<Offset> {
// Calculate centroid
val centroid = Offset(
points.sumOf { it.x.toDouble() }.toFloat() / points.size,
points.sumOf { it.y.toDouble() }.toFloat() / points.size
)
// Sort points by their angle from centroid
return points.sortedBy { point ->
-kotlin.math.atan2(
(point.y - centroid.y).toDouble(),
(point.x - centroid.x).toDouble()
)
}
}
private fun expandLineToQuad(p1: Offset, p2: Offset): List<Offset> {
val dx = p2.x - p1.x
val dy = p2.y - p1.y
val length = sqrt(dx * dx + dy * dy)
val perpX = -dy / length * 10 // Perpendicular vector, scaled
val perpY = dx / length * 10
return listOf(
p1,
Offset(p1.x + perpX, p1.y + perpY),
p2,
Offset(p2.x + perpX, p2.y + perpY)
)
}
private fun lineIntersectionBorder(p1: Offset, p2: Offset, p3: Offset, p4: Offset): Offset? {
val denom = (p1.x - p2.x) * (p3.y - p4.y) - (p1.y - p2.y) * (p3.x - p4.x)
if (denom.toDouble() == 0.0) return null
val x = ((p1.x * p2.y - p1.y * p2.x) * (p3.x - p4.x) - (p1.x - p2.x) * (p3.x * p4.y - p3.y * p4.x)) / denom
val y = ((p1.x * p2.y - p1.y * p2.x) * (p3.y - p4.y) - (p1.y - p2.y) * (p3.x * p4.y - p3.y * p4.x)) / denom
return Offset(x, y)
}
private fun triangleAreaBorder(p1: Offset, p2: Offset, p3: Offset): Double {
return abs((p1.x * (p2.y - p3.y) + p2.x * (p3.y - p1.y) + p3.x * (p1.y - p2.y)) / 2.0)
}}
The best case occurs when the index chosen randomly on the first iteration corresponds to an index containing a 1.
The probability of success in the first iteration is: P(success on first try) = Number of elements equal to 1 / Total number of elements = (n/2) / n = 1/2.
The number of iterations in this case: 1.
Complexity: O(1).
In the average case, the number of trials required to find a position containing a 1 follows a geometric distribution. This is because each attempt is an independent trial with a success probability of 1/2.
The expected number of trials for a geometric distribution is: E(number of trials) = 1 / P(success) = 1 / (1/2) = 2.
On average, the algorithm requires 2 iterations to find a 1.
Average complexity: O(1).
The worst case happens when the algorithm repeatedly selects indices containing 0 before finally finding a 1. Theoretically, it could take an arbitrarily large number of iterations before success, as the process is probabilistic.
The probability of failing k-1 times and succeeding on the k-th trial is: P(success on k-th trial) = (1 - 1/2)^(k-1) * (1/2).
While the worst-case scenario is extremely rare, the number of iterations can grow arbitrarily large.
Worst-case complexity: unbounded (theoretically infinite).
Summary of Performances
| Case | Number of Iterations | Complexity |
|---|---|---|
| Best Case | 1 | O(1) |
| Average Case | 2 | O(1) |
| Worst Case | Theoretically infinite | Unbounded |
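For illustration, a small simulation of the analysis above (a hypothetical array with n/2 ones, repeatedly probing random indices and averaging the number of probes) should come out close to the expected 2 trials:
import random

def trials_to_find_one(arr):
    """Count how many random probes it takes to hit a 1."""
    count = 0
    while True:
        count += 1
        if arr[random.randrange(len(arr))] == 1:
            return count

n = 1_000
arr = [1] * (n // 2) + [0] * (n // 2)    # exactly half ones, half zeros
samples = [trials_to_find_one(arr) for _ in range(10_000)]
print(sum(samples) / len(samples))        # should be close to 2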
Yes, use the Automation Framework instead of the packaged API scan. It is much more flexible. It supports technology configuration. If you have more detailed questions then the ZAP User Group is a better place to ask questions.
Throwing SessionNotCreatedException: this is because you are using the same user for different threads. Use different users for parallel testing.
Try this and confirm.
Today, in 2025, I would definitely go for ChilliCream, HotChocolate and Fusion. These are mature libraries and tools.
Perhaps you are storing this object from some other object and it's saved by reference rather than as a copy/clone. Try the following and then see:
input=JSON.parse(JSON.stringify(someObj))
or
input={...someObj}
Oddly enough, I realized that Artifact Registry was properly configured locally. I had to add gcloud auth configure-docker <region>-docker.pkg.dev to specify the region configuration.
I had the same problem; I fixed it by installing Maven.
I hope I'm not missing something, and that this will help the next person who reads this.
I would say that the best answer I know about right now is to consider Penlight as Lua's de facto standard library and use https://lunarmodules.github.io/Penlight/libraries/pl.stringx.html#split
I am not writing the cleaner code for you. Why not? Because you will learn more from writing it yourself after you have read the suggestions. (Moderators and some users don’t like the purpose of leaving out code being stated. So this piece of information risks getting deleted.)
My suggestions are:
Declare constants: Extract constants, for example RANDOM_GENERATOR, NUMBER_OF_ROWS, NUMBER_OF_COLUMNS, HEADER_LETTERS and BOUND_EXCLUSIVE = 2 (the number that you pass to ran.nextInt()).
Methods: As @Anonymous says in a comment, divide into methods. Have one method for generating the random number, one for constructing the 2D array and one for printing it. When I print tables in plain text I also often write a method for printing one row that I use for printing the header row and for each content row; but I don’t think it is worth it here.
Loops or streams:
As others have said, there are good ways to avoid the 12 lines with repeated calls to nextInt() and the variables coming out of them, for which no 12 good names exist. For me, I would start with the idea from @DuncG: use Arrays.setAll() for filling each row. It combines fine with a method for generating the random numbers, as I suggested. You also need a loop or stream operation to generate all the rows.
In the cases where the array already exists prefer an enhanced for loop over a loop with an index:
for (int[] row : array) {
// Process inner array
}
String joiner: Since the advent of string joiners I have found them elegant for generating a printed row with spaces between the elements and without a space at the end.
For the header letters: String.join(" ", letter)
For each content row:
Arrays.stream(row)
.mapToObj(Integer::toString)
.collect(Collectors.joining(" "));
In most cases, separate validation functions for each field type are the better choice. This approach aligns with the Single Responsibility Principle (SRP) and makes your code more modular, readable, and maintainable. It also simplifies testing and debugging, as each function is responsible for a single task.
However, if your validation logic is very simple and unlikely to change, a single function with a switch or if-else structure might suffice. Just be cautious about maintaining clarity and avoiding overly complex logic.
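For illustration only (in Python purely as an example; the field names and rules are made up, since the original fields aren't shown), separate per-field validators might look like this:
def validate_email(value):
    if "@" not in value:
        raise ValueError("invalid email")

def validate_age(value):
    if not (0 <= value <= 150):
        raise ValueError("invalid age")

# One small function per field keeps each rule independently testable.
VALIDATORS = {"email": validate_email, "age": validate_age}

def validate(record):
    for field, validator in VALIDATORS.items():
        validator(record[field])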
Updating the description from the AWS console will do the job. My use case: I was storing an SSM parameter in a static variable in a Lambda function as a cache. I updated the value of the SSM parameter and then updated the description of the Lambda from the AWS console, and the updated value was reflected in the Lambda.
I have a k8s master and worker node setup, was trying to pull from a local Docker registry on my worker node, and ran into the same issue. I tried almost all the suggested fixes but nothing worked for me except this: Insecure_registries_pull_from_local_docker. My environment is Ubuntu 22; the config must be applied on both ends (master and worker) to work.
You can simply squash-and-merge instead.
- name: Enable auto-merge for Dependabot PRs
  if: contains(github.event.pull_request.labels.*.name, 'automerge')
  run: gh pr merge --squash --auto "$PR_URL"
  env:
    PR_URL: ${{github.event.pull_request.html_url}}
    GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
I had the same issue. Fixed it by downgrading to Ubuntu 22.04 instead of using ubuntu-latest, which has recently been upgraded to 24.04 as per this. Try changing runs-on: ubuntu-latest to runs-on: ubuntu-22.04 in ci.yml.
If Apache Commons Lang has been added to your project, use:
org.apache.commons.lang3.StringUtils.equalsIgnoreCase()
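For instance (a tiny sketch; the strings are arbitrary), the call is null-safe, unlike calling equalsIgnoreCase on a possibly null String:
import org.apache.commons.lang3.StringUtils;

public class Demo {
    public static void main(String[] args) {
        System.out.println(StringUtils.equalsIgnoreCase("HELLO", "hello")); // true
        System.out.println(StringUtils.equalsIgnoreCase(null, "hello"));    // false, no NullPointerException
    }
}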
I recommend not mixing threads with asyncio; I wasted a day trying to replicate all the examples here and still got the error "cannot schedule new futures after interpreter shutdown". It seems the asyncio loop is getting shut down by its thread at some point in my case, maybe while switching between threads, and the lib I am using has many async functions in its internals.
I only got rid of this error by removing asyncio from threads. I think it's a valid warning for those who may experience a similar issue.
To fix the issue, update your Aptfile to replace libasound2 and libasound2-dev with libasound2t64
If these are two different repos then it's not possible to call the APIs implemented in Playwright directly within your WebdriverIO tests. You'll need to duplicate or rewrite them for WebdriverIO.
If you use the search terms "Basic Text Area Styling in Spotfire" you will find a YouTube video that should help you. It is from 2020, but hopefully up to date enough for your needs.
When you test the changes you applied, are you trying it locally on the server or checking it from a different environment? If you try this on the server itself, it will already show you the error, because it considers you safe.
TextField(
  controller: _controller,
  decoration: InputDecoration(
    labelText: 'Enter Amount',
    border: OutlineInputBorder(),
    suffixText: ' USD', // Currency symbol as suffix
    suffixStyle: TextStyle(fontSize: 16, fontWeight: FontWeight.bold),
  ),
  keyboardType: TextInputType.numberWithOptions(decimal: true),
),
It looks like you're trying to order the results by the position field within a nested relationship, but the orderBy isn't working as expected. In your query, you're using the with method to eager load the catproduct and rproduct relationships, and then applying the orderBy inside the closure for the rproduct relationship.
However, the issue could arise from the way the query is structured. Here's how you can modify the code to ensure it works properly:
$pr_data = Category::with(['catproduct.rproduct' => function($query) {
$query->where('status', '1')->orderBy('position', 'asc');
}])
->where('id', $categoryDetials->id)
->get();
This code should work if the relationships are set up correctly, but if it's still not working, there are a few things to check:
Check the relationships: the Category model should have a catproduct relationship, and the CatProduct model an rproduct relationship, like:
public function catproduct()
{
return $this->hasMany(CatProduct::class);
}
public function rproduct()
{
return $this->hasMany(RProduct::class);
}
Check the position field: ensure that the position field exists in the rproduct table and is an integer (or another comparable type).
Use orderBy on the parent level: if you want to order the Category results as well, you can apply orderBy to the main query:
$pr_data = Category::with(['catproduct.rproduct' => function($query) {
    $query->where('status', '1')->orderBy('position', 'asc');
}])
    ->where('id', $categoryDetials->id)
    ->orderBy('id', 'asc') // Example: Ordering categories by their ID
    ->get();
I encountered a similar issue where I couldn't find the pages_read_engagement and other page-related permissions.
To resolve this, I created a completely new app and selected the App Type: "Business". This allowed me to access all the necessary permissions. Please note that you should only select the Consumer type if you need basic login permissions and advertisement permissions
Frankly speaking, people who try to crack others' property have a very low level of education in the IT field. Just do netstat -an. Why scan the whole range from 1 to 65k? Why do you say a normal app isn't allowed to be a server app and open a port? Which ports are allowed and which are not? Very low level of IT knowledge.
This is actually a common issue when working with validation annotations in Spring Boot. The problem here is that both @NotBlank and @Size(min = 2) are getting triggered, even when you send an empty string. Here’s why:
@NotBlank checks if the string is not empty (or just whitespace). @Size(min = 2) checks the length of the string to make sure it has at least 2 characters. When you send an empty string, @NotBlank will fail first because the field is empty. But Spring’s validation process doesn’t stop there — it will still evaluate other constraints, like @Size(min = 2), and give you both error messages.
In terms of validation flow, Spring doesn’t guarantee that @NotBlank will always run first, and both annotations are usually evaluated together. So in this case, you’re seeing both errors even though you expected @NotBlank to take precedence.
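To make the behavior concrete, a field declared like the sketch below (the class and field names are hypothetical; the jakarta.validation imports assume Spring Boot 3, use javax.validation on older versions) will return both messages for an empty string, because each constraint is evaluated independently:
import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.Size;

public class UserRequest {
    // An empty "" violates both constraints, so both messages come back in the response.
    @NotBlank(message = "Name must not be blank")
    @Size(min = 2, message = "Name must have at least 2 characters")
    private String name;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}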
My BGMI account is not logging in to my Twitter account. Please fix this bug and glitch. Thanks.
The following steps worked for me:
run flutter clean
close the project
open the android folder with Android Studio
sync with Gradle
close Android Studio
open your project
run flutter pub get
Finally, I changed my method of working with my PowerPoint. I use the approach described in the python-pptx library:
if name == "main": # Charger le template template_path = 'path/to/template.pptx' with open(template_path, 'rb') as f: source_stream = BytesIO(f.read()) prs = Presentation(source_stream)
# Exemple de données
data = {
'nom_projet': 'Projet X',
'date': '2023-10-01',
# Ajoutez d'autres paires clé-valeur selon vos besoins
}
generate_template(prs, data)
fill_in_doc(prs, data)
# Sauvegarder la présentation
target_stream = BytesIO()
prs.save(target_stream)
with open('path/to/output.pptx', 'wb') as f:
f.write(target_stream.getvalue())
JsonExecutionRequest is not a dict object.
I have this exact same issue / question. The 'Startsession' middleware resets the session and therefore resets the set flash messages (it seems).
Really hope someone found out how to get both 'normal' sessions and flash-messages to work properly in Laravel 11.
Thanks to @PySir I got it working!
You were right. You cannot "Upload & Publish" in one step: the video needs to be uploaded first, and then you can 'create a video object' on your channel and publish it. After uploading a video you can then create the object and publish in one step, and that's where I was getting confused, as the docs say to 'upload & publish'.
Anyways, Here's what I used:
app.post('/api/dailymotion/upload', async (req, res) => {
const localVideoDirectory = path.join(__dirname, 'videos');
try {
const videos = fs.readdirSync(localVideoDirectory)
.filter(file => file.endsWith('.mp4') || file.endsWith('.mkv') || file.endsWith('.avi'))
.map(file => ({
name: file,
path: path.join(localVideoDirectory, file),
normalizedName: file.replace(/\.[^/.]+$/, '').toLowerCase(),
}));
if (!videos.length) {
return res.status(400).send('No video files found in the local directory.');
}
console.log('[📂] Found Local Videos:', videos);
const token = await getAccessToken();
const existingVideosResponse = await axios.get(`${DAILY_API_BASE}/me/videos`, {
params: { fields: 'title', access_token: token },
});
const existingVideoTitles = existingVideosResponse.data.list.map(video =>
video.title.toLowerCase().trim()
);
console.log('[📂] Existing Dailymotion Videos:', existingVideoTitles);
const videosToUpload = videos.filter(video =>
!existingVideoTitles.includes(video.normalizedName)
);
console.log('[📂] Videos to Upload:', videosToUpload);
if (!videosToUpload.length) {
return res.status(200).send('All videos are already uploaded.\n');
}
const uploadResults = await Promise.all(videosToUpload.map(async (video) => {
try {
console.log(`[📂] Preparing to upload video: ${video.name}`);
const uploadUrlResponse = await axios.get(`${DAILY_API_BASE}/file/upload`, {
params: { access_token: token },
});
const uploadUrl = uploadUrlResponse.data.upload_url;
console.log('[🌐] Upload URL:', uploadUrl);
const videoData = fs.readFileSync(video.path);
const form = new FormData();
form.append('file', videoData, video.name);
const uploadResponse = await axios.post(uploadUrl, form, {
headers: { ...form.getHeaders() },
maxContentLength: Infinity,
maxBodyLength: Infinity,
});
console.log('[✅] Video Uploaded:', uploadResponse.data);
const videoUrl = uploadResponse.data.url;
const videoName = uploadResponse.data.name;
console.warn('[🌐] Video URL for Publishing:', videoUrl);
console.warn('[🌐] Video Name for Publishing:', videoName);
const publishResponse = await axios.post(
`${DAILY_API_BASE}/me/videos`,
{
url: videoUrl,
title: videoName,
channel: 'music',
published: 'true',
is_created_for_kids: 'false',
},
{
headers: {
Authorization: `Bearer ${token}`,
'Content-Type': 'application/x-www-form-urlencoded',
},
}
);
console.log('[✅] Video Published:', publishResponse.data);
return publishResponse.data;
} catch (error) {
console.error(`[❌] Error processing video (${video.name}):`, error.response?.data || error.message);
return { error: error.message, video: video.name };
}
}));
res.json(uploadResults);
} catch (error) {
console.error('[❌] Error in upload endpoint:', error.message);
res.status(500).send('Error uploading videos.');
}
});
I need to refactor some things now, making it more comprehensive and fault-tolerant, point it back to the cloud instance for finding files automatically, and finish some styling. Other than that, the core logic is there now and I am able to scan for and upload videos programmatically! Thank you very much for your patience @PySir
Use negative numbers for the rows before the current row in rowsBetween. It's NULL because there are no rows between start=5 and end=0.
window = Window.orderBy("new_datetime").rowsBetween(-5, Window.currentRow)
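For context, a small self-contained sketch of how that window could be used (the column names and the aggregate are made up for illustration):
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [(i, float(i)) for i in range(10)], ["new_datetime", "value"]
)

# Rolling average over the current row and the 5 rows before it.
window = Window.orderBy("new_datetime").rowsBetween(-5, Window.currentRow)
df.withColumn("rolling_avg", F.avg("value").over(window)).show()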
I found that your .env file contains DATABASE_URL, which is used by the Makefile to do the migration up (https://github.com/bantawa04/go-fiber-boilerplate/blob/016a2b45c06882aea0d1efb8123e0cecdac427e2/.env#L11). I believe your migration up failed because the database host is database in the .env file, which does not point to anything. To make your code work you need to update DATABASE_URL from
postgresql://${DB_USERNAME}:${DB_PASSWORD}@database:5432/${DB_NAME}?sslmode=${DB_SSL_MODE}
to
postgresql://${DB_USERNAME}:${DB_PASSWORD}@${DB_HOST}:5432/${DB_NAME}?sslmode=${DB_SSL_MODE}
DB_HOST can be anything like localhost, an IP address, or your database's host on another server.
I'm just updating an app from PrimeVue v3 to v4. It looks like a font is no longer defined as a default.
From https://primevue.org/theming/styled/
There is no design for fonts as UI components inherit their font settings from the application.
So, it seems correct that we now have to define the font-family on the body tag instead.
There is no default font defined
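For example, something like this in your global stylesheet (the font stack itself is just an illustration):
body {
    font-family: "Inter", "Segoe UI", Arial, sans-serif;
}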
I ended up doing as @MartinDotNet suggested: creating a dummy root span at the beginning and using it as a parent for the future spans. The main point that I was missing is that children can be attached to the parent span even after the parent is already closed.
In 2025, Microsoft has their own npm package. The link is https://www.npmjs.com/package/@microsoft/clarity
If you are using a Room database, then check your entities.
I was using a predefined database created with DB Browser (SQLite), and I had camel-cased a couple of fields, and this was causing the problem.
Try running the app normally and check your Logcat.
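As an illustration (the table and column names here are hypothetical), the entity fields have to match the column names in the prepackaged database exactly, or you can map a camel-cased Kotlin property explicitly with @ColumnInfo:
import androidx.room.ColumnInfo
import androidx.room.Entity
import androidx.room.PrimaryKey

@Entity(tableName = "users")
data class User(
    @PrimaryKey val id: Long,
    // The DB Browser column is "first_name", so map the camel-cased property to it explicitly.
    @ColumnInfo(name = "first_name") val firstName: String
)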
It's working for me with the version below:
pip install google-cloud-secret-manager==2.10.0
from google.cloud import secretmanager
The answer is complex, dealing with multiple different kinds of constructors.
Let's break it down:
books *B = new books [num];
This actually calls the default constructor num times. Try putting a cout statement in the default constructor and you will see it being called.
B[i] = books (author, title, price, publisher, stock);
This calls the copy-assignment operator. Again, if you add the copy-assignment operator, books& operator=(const books& other), and add a cout, it will appear. Because you are assigning to an element that already exists, a temporary books object is constructed, copy-assigned into the element you instantiated when you created the original books array, and then destructed at the end of the statement, which is why you see extra constructor and destructor calls inside the for loop.
what could I do so that the destructor is called only at the end of main()
There are ways to do this with C++ pointers, such as having an array of pointers. Something like this:
Foo** foo = new Foo*[num];
for (int i = 0; i < num; i++)
{
foo[i] = new Foo();
}
for (int i = 0; i < num; i++)
{
delete foo[i];
}
delete[] foo;
But this is rather complicated. There are many solutions, but I think the simplest solution in your case is just to update the instance rather than create a new object, i.e. B[i].author = author
Secondly, if, instead of an explicit destructor, I make a function like destroybooks() below and call it through a for loop, do I then also need to have a statement like delete []B; at the very end?
Yes. By calling new with books* b = new books[num]; you are adding data to the heap, which must be explicitly released. Otherwise, the memory can be leaked. I would suggest looking up the new keyword. Some resources below to get you started:
https://en.cppreference.com/w/cpp/language/new
https://www.geeksforgeeks.org/new-and-delete-operators-in-cpp-for-dynamic-memory/