I came across a post discussing proposals to bring Rust-style safety features to C++. You can find the post here: link. (It covers what the committee wants to add, with proposals ranging from introducing safety profiles to implementing a borrow checker similar to Rust's.)
Currently, C++ security depends not only on the developers of C++ itself, but mostly on the programmers who use the safety features the language already provides, such as the smart pointers std::unique_ptr, std::shared_ptr, and std::weak_ptr.
C++26 is still under development, as is C++29. C++26 is expected to be released in 2026.
I’m thinking of using a Pub/Sub topic. There are third-party SMS gateway services that allow you to configure a webhook that triggers when an SMS is received. This webhook can then publish the data to a Pub/Sub topic. You can then create a Cloud Run service that subscribes to the topic; this service will receive the SMS data and insert it into a BigQuery table. Let me know if this works.
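On the Cloud Run side, a push subscription delivers a JSON envelope whose message.data field is base64-encoded, and the service has to decode that before writing the row to BigQuery. A minimal sketch of just the envelope parsing (the SMS field names "from" and "body" are assumptions about what the gateway would publish, and the actual BigQuery insert is omitted):

```python
import base64
import json

def parse_pubsub_push(envelope: dict) -> dict:
    """Decode the published payload from a Pub/Sub push envelope.

    Pub/Sub wraps the published bytes in envelope["message"]["data"],
    base64-encoded; the Cloud Run service decodes it before inserting
    the row into BigQuery (insert call omitted here).
    """
    data = envelope["message"]["data"]
    return json.loads(base64.b64decode(data))

# Envelope shaped the way a push subscription would deliver it; the SMS
# field names "from" and "body" are assumptions for illustration.
envelope = {
    "message": {
        "data": base64.b64encode(
            json.dumps({"from": "+15551234567", "body": "hello"}).encode()
        ).decode(),
        "messageId": "1",
    },
    "subscription": "projects/demo/subscriptions/sms-sub",
}

sms = parse_pubsub_push(envelope)
print(sms["body"])  # hello
```

From here the handler would pass the decoded dict to a BigQuery streaming insert and return a 2xx status so Pub/Sub acknowledges the message.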
console.log("Hello world");
function greet() {
  return "Hello world";
}
color: skyblue;
background: green;
background-size: cover;
<html>
<body>
  <input type="email" class="email" id="email">
  <input type="password" class="password" id="password">
</body>
</html>
I tried making a few changes to the code and added in a few debugging triggers. I think I at least narrowed down where the issue is.
// Wrap the string in double quotes so it will work with the INDIRECT function
object refCellObj = Excel(xlfIndirect, $"\"{strCell}\"");

// DIAGNOSTIC LOGGING
strDebug = $"[DEBUG] INDIRECT({strCell}) → {refCellObj?.GetType().Name}";
For the case where strCell = D27 (same sheet), everything works fine.
For the case where strCell = Sheet2!D27, the value of refCellObj = ExcelError, because xlfIndirect appears to be choking on strCell.
Any idea how I should format the string (strCell) that gets fed into the xlfIndirect function?
I think this can be solved by filling the missing values first, right after loading the data:
data['variable'] = data['variable'].fillna('No data')
and then converting them to categorical:
data['variable_q'] = pd.Categorical(data['variable'])
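Putting the two steps together, a minimal runnable sketch (the column name variable is taken from the snippets above; note that fillna returns a new Series, so it has to be assigned back):

```python
import numpy as np
import pandas as pd

# Toy frame standing in for the loaded data; the column name
# 'variable' follows the snippets above.
data = pd.DataFrame({"variable": ["a", np.nan, "b", np.nan]})

# fillna returns a new Series, so assign it back
data["variable"] = data["variable"].fillna("No data")

# then turn the column into a categorical
data["variable_q"] = pd.Categorical(data["variable"])

print(data["variable_q"].cat.categories.tolist())  # ['No data', 'a', 'b']
```

After this, 'No data' is an ordinary category, so groupbys and value counts will include the formerly missing rows.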
So this has worked for me as well! But can anyone help me with the C# decryption logic? Below, I'm sharing the Java decryption code and the equivalent C# code, too. The C# code throws the error "MAC check failed. Data may be tampered.", indicating that calcTag.SequenceEqual(tag) is failing.
I'm also sharing the Java encryption logic here.
The Java Encryption Code:
package encrypt;
import java.nio.charset.StandardCharsets;
import java.security.InvalidAlgorithmParameterException;
import java.security.InvalidKeyException;
import java.security.KeyFactory;
import java.security.NoSuchAlgorithmException;
import java.security.NoSuchProviderException;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Security;
import java.security.Signature;
import java.security.SignatureException;
import java.security.spec.InvalidKeySpecException;
import java.security.spec.PKCS8EncodedKeySpec;
import java.security.spec.X509EncodedKeySpec;
import java.time.Instant;
import java.util.Base64;
import javax.crypto.Cipher;
import org.bouncycastle.crypto.digests.Blake2bDigest;
import org.bouncycastle.jce.provider.BouncyCastleProvider;
import org.bouncycastle.jce.spec.IESParameterSpec;
public class Encrypt {
public static void main(String[] args) throws Exception {
// plain-text request
String input = "thIS IS MANISH";
String subid = "SO29";
String aiId = "SA29";
String currentTime = String.valueOf(Instant.now().toEpochMilli());
String expiryTime = String.valueOf(Instant.now().toEpochMilli() +(Integer.parseInt("180000")));
String signature = signPayload(input, currentTime, expiryTime);
String protectedInfo = Base64.getEncoder().encodeToString(getProtectedDetails(subid,aiId, currentTime, expiryTime).getBytes());
String encryptedpayload = encryptECIES(input);
String finalPayload = "{\"payload\":\"" + encryptedpayload + "\",\"signatures\":[{\"signature\":\"" + signature +
"\",\"protectedInfo\":\"" + protectedInfo + "\"}]}";
System.out.println(finalPayload);
}
private static String getProtectedDetails(String subscriberId, String keyID, String currentTime, String expiryTime) {
return "keyId=\"" + subscriberId + "|" + keyID + "|" + "ecdsa" + "\",algorithm=\"ecdsa\",created=\"" +
currentTime +
"\",expires=\"" + expiryTime +
"\",headers=\" (created)(expires)digest\"";
}
public static String blake2b512(String input){
System.out.println(input);
// Create Blake2b digest
Blake2bDigest blakeHash = new Blake2bDigest(512);
blakeHash.update(input.getBytes(), 0, input.getBytes().length);
byte[] hashByte = new byte[blakeHash.getDigestSize()];
blakeHash.doFinal(hashByte, 0);
System.out.println("Blake2b-512 Hash: " );
String encodedString = Base64.getEncoder().encodeToString(hashByte);
System.out.println(encodedString);
return encodedString;
}
public static String signPayload(String input, String currentTime, String expiryTime) throws NoSuchAlgorithmException, SignatureException, NoSuchProviderException, InvalidAlgorithmParameterException, InvalidKeySpecException, InvalidKeyException {
String concatenated = "(created):" + currentTime + "\n"
+ "(expires):" + expiryTime + "\n" + "digest:BLAKE2b-512="
+ blake2b512(input);
System.out.println(concatenated);
// partner's private key
//String encodedPrivateKey = "TUlHSEFnRUFNQk1HQnlxR1NNNDlBZ0VHQ0NxR1NNNDlBd0VIQkcwd2F3SUJBUVFnRlNsV0UwaS8wWC9iYVE0N1lMUUNST1BSTFFnMXRib1F4L2xQeW4xUUZwT2hSQU5DQUFRVHpqMUsvMmc5blZTU3hTR0o4VnpzWU90Ui9rVFN5WG02dXNGSUJNbHEzcHQ1Nzh6cjlCZkY0aTIzOGRodzhWQ05obi9jQmZwQTJoNGJFK3JoTkJ3NA==";
String encodedPrivateKey = "MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgFSlWE0i/0X/baQ47YLQCROPRLQg1tboQx/lPyn1QFpOhRANCAAQTzj1K/2g9nVSSxSGJ8VzsYOtR/kTSyXm6usFIBMlq3pt578zr9BfF4i238dhw8VCNhn/cBfpA2h4bE+rhNBw4";
KeyFactory keyFactory = KeyFactory.getInstance("EC");
PKCS8EncodedKeySpec privateKeySpec = new PKCS8EncodedKeySpec(Base64.getDecoder().decode(encodedPrivateKey));
PrivateKey privateKey = keyFactory.generatePrivate(privateKeySpec);
// Original message to be signed
// Create a signature instance using SHA-512 with ECDSA
Signature ecdsaSignature = Signature.getInstance("SHA512withECDSA");
// Initialize the signature instance with the private key for signing
ecdsaSignature.initSign(privateKey);
// Supply the data to be signed
byte[] data = concatenated.getBytes(StandardCharsets.UTF_8);
ecdsaSignature.update(data);
// Generate the digital signature
byte[] digitalSignature = ecdsaSignature.sign();
// Convert the digital signature to Base64 for easy display and transmission
String base64Signature = Base64.getEncoder().encodeToString(digitalSignature);
System.out.println("Digital Signature (Base64): " + base64Signature);
return base64Signature;
}
public static String encryptECIES(String payLoad) throws Exception {
Security.addProvider(new BouncyCastleProvider());
//npci's public key
String key = "-----BEGIN PUBLIC KEY-----\r\n"
+ "MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEEa2MrinHuivU0kE3hyJxxgoq96/N\r\n"
+ "DiTw8KxI6A+WXSStWrKUPwLYHdzKw5Z314ry6D9lpkMZflTP0BeCIZRwuw==\r\n"
+ "-----END PUBLIC KEY-----\r\n"
+ "";
String base64PublicKey = key
.replace("-----BEGIN PUBLIC KEY-----", "")
.replace("-----END PUBLIC KEY-----", "")
.replaceAll("\\s", "");
KeyFactory keyFactory = KeyFactory.getInstance("EC");
X509EncodedKeySpec x509EncodedKeySpec = new X509EncodedKeySpec(Base64.getDecoder().decode((base64PublicKey)));
PublicKey publicKey = keyFactory.generatePublic(x509EncodedKeySpec);
Cipher cipher = Cipher.getInstance("ECIESwithSHA512/NONE/NoPadding");
IESParameterSpec iesParamSpec = new IESParameterSpec(null, null, 256);
cipher.init(1, publicKey, iesParamSpec);
return Base64.getEncoder().encodeToString(cipher.doFinal(payLoad.getBytes()));
}
}
The Java Decryption Code:
package encrypt;
import java.security.InvalidAlgorithmParameterException;
import java.security.InvalidKeyException;
import java.security.KeyFactory;
import java.security.NoSuchAlgorithmException;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Security;
import java.security.Signature;
import java.security.SignatureException;
import java.security.spec.InvalidKeySpecException;
import java.security.spec.PKCS8EncodedKeySpec;
import java.security.spec.X509EncodedKeySpec;
import java.time.Instant;
import java.util.Base64;
import java.util.HashMap;
import java.util.Map;
import javax.crypto.BadPaddingException;
import javax.crypto.Cipher;
import javax.crypto.IllegalBlockSizeException;
import javax.crypto.NoSuchPaddingException;
import org.bouncycastle.crypto.digests.Blake2bDigest;
import org.bouncycastle.jce.provider.BouncyCastleProvider;
import org.bouncycastle.jce.spec.IESParameterSpec;
import org.json.JSONObject;
public class Decrypt {
public static void main(String[] args) throws Exception{
// encrypted payload
String input = "{\"payload\":\"BlZphrMQwR/8s6VKQtciQtZGRWH07Lgk7IJWk1QIywmq7WbV0myS8oH+URn1RJziEJ4nC5ShODAF2iOOTt8gPgsXT23Gkq4orFqiyL2I\",\"protectedInfo\":\"a2V5SWQ9IlNPMjl8U0EyOXxlY2RzYSIsYWxnb3JpdGhtPSJlY2RzYSIsY3JlYXRlZD0iMTc0OTQ2MTkxMzc0MSIsZXhwaXJlcz0iMTc0OTQ2MjIxMzc0MiIsaGVhZGVycz0iIChjcmVhdGVkKShleHBpcmVzKWRpZ2VzdCI=\"}]}";
JSONObject jsonObject = new JSONObject(input);
String payload = createDecryptionCipher(jsonObject.getString("payload"));
String protectedInfo = jsonObject.getJSONArray("signatures").getJSONObject(0).getString("protectedInfo");
String decodedProtectedInfo = new String(Base64.getDecoder().decode(protectedInfo.getBytes()));
String signature = jsonObject.getJSONArray("signatures").getJSONObject(0).getString("signature");
System.out.println("decodedProtectedInfo : " + decodedProtectedInfo);
System.out.println("payload : " + payload);
System.out.println("signature : " +signature);
//NPCI's public key
String encodedPublicKey = "MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEE849Sv9oPZ1UksUhifFc7GDrUf5E0sl5urrBSATJat6bee/M6/QXxeItt/HYcPFQjYZ/3AX6QNoeGxPq4TQcOA==";
PublicKey publicKey = convertToPublicKey(Base64.getDecoder().decode(encodedPublicKey));
boolean result = validateSignature(signature, decodedProtectedInfo, payload, publicKey);
if(result){
System.out.println("signature validation successfully");
System.out.println("payload: " + payload);
}else{
System.out.println("signature validation failed");
}
}
public static PublicKey convertToPublicKey(byte[] encodePubKey) throws NoSuchAlgorithmException, InvalidKeySpecException {
KeyFactory keyFactory = KeyFactory.getInstance("EC");
X509EncodedKeySpec x509EncodedKeySpec = new X509EncodedKeySpec(encodePubKey);
return keyFactory.generatePublic(x509EncodedKeySpec);
}
private static String createDecryptionCipher(String input) throws NoSuchPaddingException, NoSuchAlgorithmException, InterruptedException,
InvalidKeySpecException, InvalidAlgorithmParameterException, InvalidKeyException, IllegalBlockSizeException, BadPaddingException {
Security.addProvider(new BouncyCastleProvider());
Cipher cipher = Cipher.getInstance("ECIESwithSHA512/NONE/NoPadding");
// partner's private key
String privateKey = "MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQg51083sOVh5xvX7cgK/pF3/v6hMhZFGYxMYoJwYNLtFuhRANCAAQRrYyuKce6K9TSQTeHInHGCir3r80OJPDwrEjoD5ZdJK1aspQ/Atgd3MrDlnfXivLoP2WmQxl+VM/QF4IhlHC7";
IESParameterSpec iesParamSpec = new IESParameterSpec(null, null,256);
cipher.init(2, getDecryptedPrivateKey(privateKey), iesParamSpec);
return new String(cipher.doFinal(Base64.getDecoder().decode(input.getBytes())));
}
private static PrivateKey getDecryptedPrivateKey(String privateKey) throws NoSuchAlgorithmException, InvalidKeySpecException {
KeyFactory keyFactory = KeyFactory.getInstance("EC");
PKCS8EncodedKeySpec privateKeySpec = new PKCS8EncodedKeySpec(Base64.getDecoder().decode(privateKey));
return keyFactory.generatePrivate(privateKeySpec);
}
public static boolean validateSignature(String signature, String protectedInfo, String payload, PublicKey key)
throws NoSuchAlgorithmException, SignatureException, InvalidKeyException {
Map<String, String> timeStamp = getTimeStamps(protectedInfo);
validateExpiryTime(timeStamp.get("expiryTime"));
byte[] sign = Base64.getDecoder().decode(signature.getBytes());
byte[] hashValue = blakeHashing(payload);
String concatenated = "(created):" + timeStamp.get("createdTime") + "\n"
+ "(expires):" + timeStamp.get("expiryTime") + "\n" + "digest:BLAKE2b-512="
+ Base64.getEncoder().encodeToString(hashValue);
return verify(concatenated.getBytes(), sign, key);
}
private static boolean verify(byte[] hashValue, byte[] signature, PublicKey publicKey) throws NoSuchAlgorithmException,
InvalidKeyException, SignatureException {
Signature verifier = Signature.getInstance("SHA512withECDSA");
verifier.initVerify(publicKey);
verifier.update(hashValue);
return verifier.verify(signature);
}
public static byte[] blakeHashing(String input){
System.out.println(input);
// Create Blake2b digest
Blake2bDigest blakeHash = new Blake2bDigest(512);
blakeHash.update(input.getBytes(), 0, input.getBytes().length);
byte[] hashByte = new byte[blakeHash.getDigestSize()];
blakeHash.doFinal(hashByte, 0);
return hashByte;
}
private static void validateExpiryTime(String s) throws SignatureException {
long currentTime = Instant.now().toEpochMilli();
if (currentTime > Long.parseLong(s)) {
System.out.println("Signature Expired");
throw new SignatureException("Signature Expired");
}
}
private static Map<String, String> getTimeStamps(String decodedProtectedInfo) throws SignatureException {
String[] protectedVal = decodedProtectedInfo.split(",");
String[] currentTimeString = protectedVal[2].split("=");
String currentTimeStamp = currentTimeString[1].substring(1, currentTimeString[1].length() - 1);
String[] expireTimeString = protectedVal[3].split("=");
String expireTimeStamp = expireTimeString[1].substring(1, expireTimeString[1].length() - 1);
Map<String, String> map = new HashMap<>();
map.put("createdTime", currentTimeStamp);
map.put("expiryTime", expireTimeStamp);
return map;
}
}
The equivalent C# code:
public static string DecryptECIES(string base64Input, ECPrivateKeyParameters privateKey)
{
byte[] inputBytes = Convert.FromBase64String(base64Input);
int ephLen = 65, tagLen = 64;
int ctLen = inputBytes.Length - ephLen - tagLen;
if (ctLen <= 0) throw new Exception("Invalid input.");
byte[] ephPubBytes = inputBytes.Take(ephLen).ToArray();
byte[] ciphertext = inputBytes.Skip(ephLen).Take(ctLen).ToArray();
byte[] tag = inputBytes.Skip(ephLen + ctLen).ToArray();
byte[] derivation = new byte[0];
byte[] encoding = new byte[0];
byte[] L = GetLengthTag(encoding);
ECDomainParameters ecParams = privateKey.Parameters;
Org.BouncyCastle.Math.EC.ECPoint q = ecParams.Curve.DecodePoint(ephPubBytes);
ECPublicKeyParameters ephPubKey = new ECPublicKeyParameters(q, ecParams);
ECDHBasicAgreement agreement = new ECDHBasicAgreement();
agreement.Init(privateKey);
BigInteger sharedSecret = agreement.CalculateAgreement(ephPubKey);
byte[] agreementBytes = Arrays.Concatenate(ephPubBytes, BigIntegers.AsUnsignedByteArray(agreement.GetFieldSize(), sharedSecret));
Kdf2BytesGenerator kdf = new Kdf2BytesGenerator(new Sha512Digest());
kdf.Init(new KdfParameters(agreementBytes, derivation));
byte[] KEnc = new byte[ciphertext.Length];
byte[] KMac = new byte[64];
byte[] K = new byte[KEnc.Length + KMac.Length];
kdf.GenerateBytes(K, 0, K.Length);
Array.Copy(K, 0, KMac, 0, KMac.Length);
Array.Copy(K, KMac.Length, KEnc, 0, KEnc.Length);
HMac mac = new HMac(new Sha512Digest());
mac.Init(new KeyParameter(KMac));
mac.BlockUpdate(ciphertext, 0, ciphertext.Length);
mac.BlockUpdate(encoding, 0, encoding.Length);
mac.BlockUpdate(L, 0, L.Length);
byte[] calcTag = new byte[mac.GetMacSize()];
mac.DoFinal(calcTag, 0);
if (!calcTag.SequenceEqual(tag))
throw new CryptographicException("MAC check failed. Data may be tampered.");
byte[] plaintext = new byte[ciphertext.Length];
for (int i = 0; i < ciphertext.Length; i++)
plaintext[i] = (byte)(ciphertext[i] ^ KEnc[i]);
return Encoding.UTF8.GetString(plaintext);
}
I have been working on a piecewise linear approximation to Euclidean distance as a hobby project, and I came up with this formula:
approximation of c = max(a, b) + (sqrt(2)-1)*min(a, b)
for coordinates [x1,y1] and [x2,y2], a = abs(x1-x2) and b = abs(y1-y2)
The upside of the approximation is that it is exact if a = b, a = 0, or b = 0.
The downside is that the absolute error grows the further apart the points are; the relative error stays bounded, peaking at about 8.2% when min(a, b)/max(a, b) ≈ 0.414.
I hope this can inspire you
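A direct transcription in Python, for anyone who wants to experiment with the error behaviour (the worst-case figure in the comments follows from maximizing (max(a, b) + (sqrt(2) - 1) * min(a, b)) / sqrt(a^2 + b^2)):

```python
import math

ALPHA = math.sqrt(2) - 1  # ≈ 0.4142

def approx_dist(x1, y1, x2, y2):
    """Piecewise linear approximation of Euclidean distance:
    max(a, b) + (sqrt(2) - 1) * min(a, b)."""
    a, b = abs(x1 - x2), abs(y1 - y2)
    return max(a, b) + ALPHA * min(a, b)

# Exact when a == b, a == 0 or b == 0:
print(approx_dist(0, 0, 3, 0))  # 3.0
print(math.isclose(approx_dist(0, 0, 3, 3), 3 * math.sqrt(2)))  # True

# Relative error is bounded; the worst case is min/max == ALPHA (~8.2% over):
print(approx_dist(0, 0, 4, 1) / math.hypot(4, 1))  # ≈ 1.07
```

Since it needs no square root, this kind of approximation is handy for distance comparisons on hardware without fast floating-point math.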
docker run -P --publish 127.0.0.1:5432:5432 -e POSTGRES_PASSWORD=Welcome123# --name my-pg postgres
For my case, I chose to implement HTTP-only cookies, as I see them as a much safer way to handle JWTs: an HTTP-only cookie is not exposed through document.cookie, so a malicious script injected via XSS cannot simply read the token and send it to an attacker.
I passed the AI‑102 certification exam in one attempt with help from Passcerthub.com! The questions were nearly identical to the real exam; absolutely worth every penny. https://www.passcerthub.com/microsoft/ai-102-dumps.html
PowerShell is for managing systems and for creating build and deployment scripts.
Unless you have found an SDK released by SAP that demonstrates how to use PowerShell with Crystal, my recommendation is to stick with the supported methods. Working with Crystal is tricky even when you use .NET or Java, which are the supported frameworks. I understand that PowerShell uses .NET under the hood and may allow some workarounds, but then you will have to discover the pitfalls yourself. In addition, it would be a security risk, since anybody could change the script and print the connection information. How are you going to implement anti-tampering with PowerShell?
I found a way that may be useful to others.
Since I want to change the deleted field, the restrictions have to be removed.
To handle complicated SQL conditions, the add() function is appropriate.
Thus, this code works:
$queryBuilder = $this->connectionPool->getQueryBuilderForTable('tx_myevents_domain_model_event');
$queryBuilder
->getRestrictions()
->removeAll();
$queryBuilder
->update('tx_myevents_domain_model_event')
->add( 'where', '(datetime_end + (days_until_deletion * 86400)) < UNIX_TIMESTAMP()')
->set('deleted', '1');
Have you ever found a solution to this? ("The request is invalid. Details: The property 'value' does not exist on type 'Microsoft.Azure.Search.OutputFieldMappingEntry' or is not present in the API version '2025-05-01-preview'.")
While importing the CSV file using the wizard in MySQL Workbench:
Set the data type of boolean fields to TINYINT(1)
Remove the header row (column names) from the file
Ensure the data in the file follows the exact column order as defined in the table
Use 0 and 1 to represent boolean values
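Steps 2–4 can also be scripted instead of edited by hand. A small sketch, assuming the source file has a header row and literal True/False strings (the column names here are made up for illustration):

```python
import csv
import io

BOOL_MAP = {"True": "1", "False": "0"}

def prepare_for_import(src_text, column_order):
    """Drop the header row, reorder columns to match the table
    definition, and map True/False to 1/0 for TINYINT(1) columns."""
    reader = csv.DictReader(io.StringIO(src_text))  # consumes the header
    out = io.StringIO()
    writer = csv.writer(out)
    for row in reader:
        writer.writerow([BOOL_MAP.get(row[col], row[col]) for col in column_order])
    return out.getvalue()

# Hypothetical source file whose columns are in the wrong order
src = "name,active,id\nalice,True,1\nbob,False,2\n"
print(prepare_for_import(src, ["id", "name", "active"]))
```

Writing the cleaned text to a new file then gives the wizard exactly the headerless, correctly ordered, 0/1 data it expects.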
You can control rounded corners with the border-radius property. You can set a single radius for circular corners, or two radii for elliptical corners:
border-radius: (value) 0 0 (value) / (value) 0 0 (value)
Or use the longhand properties:
border-top-left-radius: (value-1) (value-2);
border-bottom-left-radius: (value-1) (value-2);
This sets a different radius on each corner; try some values until you reach your shape.
As for the gradient, I think you will need to place an element above your picture and give it a gradient with some color opacity.
Try adding a height to the tab container.
import { useEffect, useRef } from "react";
import { motion } from "framer-motion";

export default function MinecraftChallengeShort() {
  const videoRef = useRef(null);

  useEffect(() => {
    if (videoRef.current) {
      videoRef.current.play();
    }
  }, []);

  return (
    <div className="w-full h-screen bg-black flex items-center justify-center">
      <div className="max-w-md w-full aspect-video relative rounded-2xl overflow-hidden shadow-2xl">
        <video
          ref={videoRef}
          className="w-full h-full object-cover"
          src="/minecraft-challenge.mp4" // 🔁 put the video link here after editing
          muted
          loop
          playsInline
        />
        <motion.div
          initial={{ opacity: 0, y: 40 }}
          animate={{ opacity: 1, y: 0 }}
          transition={{ delay: 0.5, duration: 0.8 }}
          className="absolute bottom-0 left-0 right-0 bg-gradient-to-t from-black/80 to-transparent p-4 text-white text-sm"
        >
          <p className="font-bold text-lg mb-1">
            I put myself in a crazy challenge... no touching the ground in Minecraft!
          </p>
          <p className="mb-1">And every block I touch disappears 😰</p>
          <p className="mb-1">But look what happened when I met a creeper! 💥</p>
          <p className="text-yellow-400 mt-2">If you want a second part… hit like and leave me a harder challenge! 🔥</p>
        </motion.div>
      </div>
    </div>
  );
}
There is no conversion function in the Oracle Database to generate shapefiles from SDO_GEOMETRY. However, the PL/SQL package SDO_UTIL provides conversion functions to/from GML, KML, WKT/WKB, GeoJSON, and JSON, i.e., well-known and widely used standards for spatial data.
I faced this issue. In my case, my dev machine's firewall was on. Turning the firewall off (System Settings -> Network -> Firewall) got the expo app to work in the iOS simulator.
I had this issue when writing C/C++ code. I just disabled GitHub Copilot and GitHub Copilot Chat, which solved the problem. Check your VS Code Extensions tab.
Just stumbled across https://github.com/gigerIT/vuetify-inertia-link : This is a simple Vue plugin that enables the use of Inertia links with Vuetify components. It adds support for the Inertia route() helper function within the "to" prop of Vuetify components.
I have just found the laziest possible approach, lol.
Just enter this:
print(context.widget.runtimeType);
Then get the Boolean value with the following:
context.widget.runtimeType == LoginPage;
Based on this, I created a function that pops the dialog if the current screen/widget where it is called is the LoginPage; otherwise, the function takes us to the LoginPage.
onOKPressed: () {
final bool isLoginPage = Navigator.of(context).canPop() &&
context.widget.runtimeType == LoginPage;
if (isLoginPage) {
Navigator.pop(context);
} else {
Navigator.pushAndRemoveUntil(
context,
CupertinoPageRoute(builder: (_) => const LoginPage()),
(route) => false,
);
}
}
I am still a beginner learning Flutter, but I hope this helps, lol.
The solution is to create the URI with
URI uri = URI.create(signedUrl);
client.get().uri(uri).retrieve().body(Resource.class)
instead of passing the string.
As per the NumPy documentation, np.array doesn't have a buffer parameter, regardless of homogeneous or heterogeneous dtypes.
But np.ndarray does have a buffer parameter.
Sharing the revised code using np.ndarray:
import numpy as np
from multiprocessing import shared_memory
my_dtype = [('name', 'U10'), ('age', 'i4'), ('weight', 'f4')]
data = [('Rex', 9, 81.0), ('Fido', 3, 27.0)]
shm = shared_memory.SharedMemory(create=True, size=np.array(data, dtype=my_dtype).nbytes)
shared_array = np.ndarray(len(data), dtype=my_dtype, buffer=shm.buf)
shared_array[:] = data
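Continuing the snippet above, a second process could attach to the same block by name and see the same records; for illustration this does it within one process. Remember to delete the NumPy views before closing, and to unlink once everyone is done:

```python
import numpy as np
from multiprocessing import shared_memory

my_dtype = [('name', 'U10'), ('age', 'i4'), ('weight', 'f4')]
data = [('Rex', 9, 81.0), ('Fido', 3, 27.0)]

# producer: create the block and fill it
shm = shared_memory.SharedMemory(create=True,
                                 size=np.array(data, dtype=my_dtype).nbytes)
shared_array = np.ndarray(len(data), dtype=my_dtype, buffer=shm.buf)
shared_array[:] = data

# consumer: attach by name instead of creating
attached = shared_memory.SharedMemory(name=shm.name)
view = np.ndarray(len(data), dtype=my_dtype, buffer=attached.buf)
names = view['name'].tolist()
print(names)  # ['Rex', 'Fido']

# drop the NumPy views before closing, or close() raises BufferError
del view, shared_array
attached.close()
shm.close()
shm.unlink()
```

In a real setup you would pass shm.name to the other process (for example, as a multiprocessing argument) and let only the creator call unlink().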
OK, the UserForm opens on the monitor where the VBA editor was last used (closed). Usually when working on a project we place Excel on one monitor and the editor on the other for more screen space, and after work we save and close the editor. VBA remembers that position, but not the editor's absolute position; rather, its position relative to Excel. So if you then open Excel on either monitor, the UserForm (or the editor) opens on the other monitor. That's how this issue begins.
So the easiest solution, without any code, is to open the editor (Alt+F11), drag it to the same monitor as Excel, and close it. Then open the UserForm.
If you want to force two decimals (for example, 5.60 instead of 5.6), you need to update the last line to:
print(f"Each person should pay £{final_amount:.2f}")
I suggest you learn about the print function and f-strings.
Replace your print with this: print(f"Each person should pay: £{final_amount:.2f}")
Very bad answer.
In fact, you should specify the package path, not the groupId and artifactId. They may or may not match; nothing will work when packages for scanning are specified this way.
Simply add the forceConsole: true option to winston.transports.Console, like this:
new winston.transports.Console({
forceConsole: true,
...
})
This works as of today.
keep-sorted can handle this:
# keep-sorted start
- # ::;waf::web application firewall
trigger: ";waf"
replace: "web application firewall"
- # ::;m$::Microsoft
trigger: ";m$"
replace: "Microsoft"
- # ::;sf::Salesforce
trigger: ";sf"
replace: "Salesforce"
# keep-sorted end
Source: example setup.
The reconnectStrategy is defined as
reconnectStrategy?: false | number | ReconnectStrategyFunction;
Setting it to false will disable reconnects; another option would be to write your own ReconnectStrategyFunction.
There's a whole section about it in the client-configuration.md
Creating a new Backup Template fixed the problem for me.
I was using the "Default" Backup Template; using a different one (with the same settings) resolved the permission-denied issue with ZIP.
Replace this:
ndkVersion = "26.3.11579264"
with this:
ndkVersion = "27.0.12077973"
Use either the Esc key to exit focus mode, or toggle between focus and browse mode with NVDA+Space. This should do exactly what you need; at least it does for me. See https://download.nvaccess.org/documentation/userGuide.html#BrowseMode
So… I changed all the columns in my table to allow NULL, used a Model for the data in my code, and now it's working fine. Earlier I had tried to use a Model following the official documentation for the Swift client with another table: when I inserted rows from the Studio and filled in all the columns, it worked. When I tried it with my main table, I saw that not all of my columns were filled; once I filled them in myself, it started working.
I had the same issue and nothing else worked for me. I removed the Firebase SDK from Swift Package Manager, added it through CocoaPods, and that fixed it.
I was facing an issue with the react-native-google-mobile-ads package. What resolved the problem for me was removing the caret (^) from the version number in package.json.
Originally, I had:
"react-native-google-mobile-ads": "^15.4.0"
I changed it to:
"react-native-google-mobile-ads": "15.4.0"
The solution we found was to make sure the certificate was marked as exportable; with this done, and the certificate regenerated through DigiCert, things worked without issue.
We then stripped the permissions back to just get and list for secrets and certificates, and it continues to work, so the broader permissions were unnecessary.
Have you figured out a solution? I'm also an OSU student and I'm having the exact same problem on the exact same server.
I had a similar problem. Unfortunately, adding data-bs-pause="false" did not work for me, nor did a few other approaches I tried.
I was following the options, methods, and events listed at the bottom of this page: https://getbootstrap.com/docs/5.2/components/carousel/
The only way I managed to pause it was:
const carousel_elem = document.getElementById("carouselID")
const carousel = new bootstrap.Carousel(carousel_elem, {
interval: 5000,
pause: "hover",
ride: "carousel",
wrap:true,
touch:true,
keyboard:true
})
//make sure these classes are added to your carousel element or else your next and previous buttons will not work.
$("#carouselID").addClass("carousel slide carousel-fade")
//if you want it to cycle
carousel.cycle()
// if needed to pause it when button was pressed add:
carousel.pause(true)
carousel._config.ride=false
carousel._config.wrap=false
//to unpause it:
carousel.pause("hover")
carousel._config.ride="carousel"
carousel._config.wrap=true
Yes, it has been fixed. But Ubuntu 20.04 doesn't 'see' anything modern enough to download/update.
Ironically, with the [faulty] compiler I'm using,
my_uint8_t / 2
generates the code I want (single 8-bit shift),
whilst my "better than dividing by 2" code I used back in the '80s,
my_uint8_t >> 1
generates more (and unnecessary) assembly code.
I think the DRF browsable API always uses GET when opened in the web browser. When I use curl -H some-token -X DELETE http://127.0.0.1:8000/api/classroom/delete/1/, it actually deletes.
For anyone also still having this issue, I got it working by creating a 'NuGet' service connection instead of 'Azure Repos/Team Foundation Server'
Make sure to fill in the 'Feed URL' field with the link that Azure Artifacts provides when you select "Connect to Feed" and the "NuGet" option in the Azure Artifacts tab.
The PAT you create for the 'Personal Access Token' field needs to have Packaging read permission, and it needs to be created in the organization that hosts the Azure Artifacts feed.
I provided a link from Azure Documentation that goes into detail on how to publish a NuGet package onto another Azure Artifacts feed.
https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/nuget?view=azure-devops&tabs=yaml
That link covers publishing, so the permissions shown are for publishing; to read, simply change those to read, as I previously mentioned.
I don't see any answer to this, and I am facing a similar problem now. I have PostgreSQL databases across different servers, each with multiple schemas, and I am trying to pipeline the data to a single database for analysis.
If you found a solution for this, please help.
Thanks
The behaviour was due to the theme used in the browser (Firefox): I was using the default theme (and my system theme is dark), so it was actually using the dark theme even though GitLab is set to the light theme. After selecting the light theme for the browser as well, the diagram displays with the correct colors.
Now you can try OOMOL Studio, where you can freely use Python or Node.js modules.
If any necessary blocks are missing, you can implement them with code yourself or check if there are suitable ones in the community.
I too have a strange problem with PostgreSQL performance for a query. The query runs in under 1 second during certain periods and takes about 20 seconds during others, but the data volume and every other detail remain the same across both time periods.
If a vacuum has happened on the table, shouldn't the behaviour be consistent for the rest of the day? Instead, in one half it's 1 s and in the other half it's 20 s. How do I handle this, and what could the root cause be?
So, I didn't find a solution on the YDB side, but it looks like it is possible to configure Testcontainers.
The following configuration saved me:
private static final GenericContainer<?> ydbContainer = new GenericContainer<>(
DockerImageName.parse("cr.yandex/yc/yandex-docker-local-ydb:latest"))
.withCreateContainerCmdModifier(cmd -> {
HostConfig hostConfig = cmd.getHostConfig();
hostConfig.withPortBindings(
new PortBinding(
Ports.Binding.bindPort(grpcPort),
new ExposedPort(grpcPort)
)
);
cmd.withHostConfig(hostConfig);
})
.withExposedPorts(grpcPort)
.withCreateContainerCmdModifier(cmd -> cmd.withHostName("localhost"))
.withExtraHost("ydb-node", "127.0.0.1")
.withEnv("GRPC_PORT", String.valueOf(grpcPort))
.withEnv("YDB_USE_IN_MEMORY_PDISKS", "1");
We define an available grpcPort and bind all internal hostnames to localhost. We also set YDB's GRPC_PORT to an available port on our machine. This way, all discovered nodes report host localhost and port grpcPort.
It's not the solution I wanted, but it's a workaround.
This is now possible with native CSS nesting.
https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_nesting
Just be sure to have a semicolon ; before the nested selector, or else the whole block is invalid:
.foo {
color: red;
> .bar {
color: blue;
}
&:before {
color: green;
}
}
I just found a PR where a dev used useMemo instead of useCallback to memoize a function. It triggered my curiosity.
Are these two functions inside a React component/hook identical?
let b = 1;
const memoizedFunction = useMemo(
() => (a) => (a + b),
[b]
);
const memoizedFunction2 = useCallback(
(a) => (a + b),
[b]
);
At first glance, they appear to achieve the same goal.
Please state your requirement clearly. Also note that you cannot continue playing sound once the iOS app is terminated.
To avoid this warning when using XML-based configuration, you need to add use-authorization-manager="true" to the <http> block, as described in the Spring Security 5.x to 6.x migration guide.
Just be careful about the answer provided by @sarin.
If you have any tables that reference that primary key as a foreign key, you will lose those links, and perhaps even those records if you have cascade-on-delete rules.
What you should do rather is re-think your table design. If there is a need to change a field that is designated as the primary key, then perhaps that field is not a good candidate for a primary key.
So, step by step (I realize this is 10 years later, but for others who might run into this issue), this is what you should do:
1. Change the primary key to be your auto-increment field (or add it if it doesn't exist) (eventid as per above)
2. Create a UNIQUE index on the field that was the primary key (jobid as per above)
Your foreign keys should still be intact. If the above fails (it depends on your database), you may need to first remove all the foreign keys and recreate them afterwards. Be sure to keep the rules intact (on delete, on update).
If you have an active database, this will all need to be done as a transaction.
I think it's generally not recommended to use ASP.NET Zero with Blazor. Zero is tightly coupled to Angular, and you would need to rewrite many things to make it work.
As @bassxzero already mentioned, it is not recommended at the component level.
But I also understand why you like the idea: it looks readable and cool. However, I think the mentioned "prop drilling" is necessary here; it makes the behavior much clearer and won't be too bad.
For instance you could have a component that contains itself a set of focusable elements. I think in those cases you want the component to decide what should be focused.
Here is an example.
If all of that didn't convince you, you could consider writing a little more logic into your directive so it finds the first element in the given element tree that is of type "input", "select", etc., or that has the "tabIndex" property set with a value >= 0.
I'm running transmission-daemon 4.0.6 (38c164933e), which is very recent, and I have the same problem. It worked for some time but then stopped. I have also not figured out how to fix this. The permissions are correct, and the user/group are the same as transmission-daemon's.
Hi there, in case someone is still facing the issue: simply check the Node version you are on with node -v.
I hit this accidentally when I forgot to switch my terminal's Node version; it was 16.x.x from another project, while this project required 20.x.x.
I had the same error when it was trying to resolve path aliases.
In my case, I added --project tsconfig.json to the command to load the proper config, and it works.
In your case, it would be:
"typeorm": "ts-node --project tsconfig.json -r tsconfig-paths/register ./node_modules/.bin/typeorm"
... or you can create another tsconfig file, e.g. tsconfig.typeorm.json, and add --project tsconfig.typeorm.json to your command.
You can try the Syncfusion Vue Diagram component. It lets you create interactive diagrams with drag-and-drop nodes and connectors, perfect for visualizing flows like the one you shared.
It also supports serializing the entire diagram to JSON using built-in APIs.
For more detailed information, refer to the following resources:
Demo: https://ej2.syncfusion.com/vue/demos/#/bootstrap5/chart/over-view.html
Documentation: https://ej2.syncfusion.com/vue/documentation/chart/vue-3-getting-started
Docs on serialization: https://ej2.syncfusion.com/vue/documentation/diagram/serialization
Syncfusion offers a free community license to individual developers and small businesses.
Note: I work for Syncfusion.
You're working with patient visit data over time and want to predict an outcome for each visit by looking at what happened during previous visits. That’s a common setup in time-based healthcare modeling. While XGBoost doesn’t “remember” sequences like some deep learning models, you can help it learn from the past by creating smart features that summarize previous visits.
1. Sort your data (by patient and visit date, so "previous" is well-defined).
2. Add lag features (values from the previous visit or visits).
3. Add rolling or cumulative stats (e.g., running means over past visits).
4. Add patient-specific features.
5. Handle missing values.
6. Split carefully (use a time-based split so future visits never leak into training).
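The feature steps above can be sketched with pandas; the column names (patient_id, visit_date, lab_value) are illustrative assumptions, not from the original question:

```python
import pandas as pd

# Hypothetical visit-level data; column names are illustrative assumptions.
df = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "visit_date": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-03-01",
                                  "2024-01-15", "2024-02-15"]),
    "lab_value": [5.0, 6.0, 7.0, 3.0, 4.0],
})

# Step 1: sort so "previous visit" is well-defined per patient.
df = df.sort_values(["patient_id", "visit_date"])
grouped = df.groupby("patient_id")["lab_value"]

# Step 2: lag feature -- the value observed at the previous visit.
df["lab_prev"] = grouped.shift(1)

# Step 3: cumulative stat over strictly earlier visits (shift avoids leakage).
df["lab_past_mean"] = grouped.transform(lambda s: s.shift(1).expanding().mean())
```

The shift(1) before the expanding mean is what keeps each row's features limited to strictly earlier visits, which matters for the "split carefully" step too.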
If the main goal is to detect whether only the most recent point is anomalous in a univariate time series, your current two-step CNN approach (global + local) is a bit overcomplicated and delicate due to the daily masking/cleaning loop.
Alternatively:
A. Use a rolling forecast error approach:
Develop a model (e.g., ARIMA, LSTM, or even a simple moving average) to predict the next point. Then:
error = abs(actual[-1] - predicted[-1])
is_anomaly = error > threshold
Calibrate the threshold using a rolling error distribution or confidence interval (e.g., mean + 3*std of past residuals).
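As a runnable sketch of approach A with a plain moving-average stand-in for the forecaster (the function name, window, and k are illustrative choices):

```python
def is_last_point_anomalous(series, window=5, k=3.0):
    """Flag the latest point when its one-step forecast error exceeds
    mean + k*std of the past forecast errors (moving-average forecaster)."""
    if len(series) <= window + 1:
        return False  # not enough history to calibrate a threshold
    errors = []
    for i in range(window, len(series)):
        predicted = sum(series[i - window:i]) / window  # moving-average forecast
        errors.append(abs(series[i] - predicted))
    past, last_error = errors[:-1], errors[-1]
    mean = sum(past) / len(past)
    std = (sum((e - mean) ** 2 for e in past) / len(past)) ** 0.5
    return last_error > mean + k * std
```

Swapping in ARIMA or an LSTM only changes how `predicted` is produced; the calibration on past residuals stays the same.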
B. Statistical test or z-score on residuals:
Establish a baseline model (even just a rolling mean), then for the latest value:
residual = actual[-1] - rolling_mean[-1]
z_score = residual / rolling_std[-1]
is_anomaly = abs(z_score) > 3
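A self-contained version of that z-score check using pandas rolling statistics; the window size and the shift that excludes the latest point from its own baseline are choices of this sketch:

```python
import pandas as pd

def latest_zscore_flag(values, window=20, z_thresh=3.0):
    """Flag the latest value if it lies more than z_thresh rolling standard
    deviations away from the rolling mean of the preceding points."""
    s = pd.Series(values, dtype="float64")
    baseline = s.shift(1)  # exclude the latest point from its own baseline
    rolling_mean = baseline.rolling(window).mean()
    rolling_std = baseline.rolling(window).std()
    residual = s.iloc[-1] - rolling_mean.iloc[-1]
    z_score = residual / rolling_std.iloc[-1]
    return bool(abs(z_score) > z_thresh)
```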
Have you seen this?
https://bootcamptoprod.com/brotli-compression-in-spring-boot/
Maybe it was even written by you? ;-)
=LET(a, UNIQUE(A2:A9),b, BYROW(a, LAMBDA(r, JOIN(",",FILTER(B2:B9,A2:A9 =r)))), HSTACK(a,b))
Another option, if you're okay with scanning the array twice, is to use vpmaxsd/vpminsd to find the minimum/maximum high 32 bits, then search for the lower half using a vpcmpeqd/vptest loop. Probably only a win if the array fits in L1.
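A scalar sketch of that two-pass decomposition (the vectorized version would do pass 1 with vpmaxsd over the high dwords and pass 2 with a vpcmpeqd/vptest search over the low dwords); it assumes the inputs fit in signed 64 bits:

```python
MASK32 = 0xFFFFFFFF

def max_int64_two_pass(values):
    """Find the max int64 by first taking the maximum *signed* high 32-bit
    word, then the maximum *unsigned* low word among elements sharing it."""
    # Pass 1: maximum signed high dword (Python's >> is an arithmetic shift).
    max_hi = max(v >> 32 for v in values)
    # Pass 2: among elements whose high dword matches, take the max low dword.
    max_lo = max(v & MASK32 for v in values if (v >> 32) == max_hi)
    return (max_hi << 32) | max_lo
```

The min variant is symmetric: take the minimum signed high dword, then the minimum unsigned low dword among the matches.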
Using htop seems to be a really good solution.
colima ssh
# Once inside
# Update the dependencies
sudo apt update
# Install htop
sudo apt install htop -y
# run htop
htop
This gives a good interface for viewing live CPU and memory usage.
Jetpack Compose Desktop doesn’t natively support setting a window as a desktop background or panel (like Qt/Gtk does), since it runs on top of JVM AWT/Swing and doesn’t expose low-level Wayland/X11 window controls.
Use the Wrap widget instead.
Wrap(
  children: [
    Text(chipText),
    SizedBox(width: 10),
    IconButton(
      icon: Icon(Icons.clear),
      onPressed: () {},
    ),
  ],
),
Installing the plugin in Eclipse is not enough.
First, YourKit itself must be installed: https://www.yourkit.com/java/profiler/download/
The installer then guides you to the plugin installation in Eclipse; in your case, that part was already done.
After that, YourKit runs from within Eclipse.
The error persists to this day, considering changes to other software...
What worked for me:
Step 1: Remove Copilot from the Extensions view.
Step 2: The option to Hide Copilot is now available.
The trick is to use --output tsv in combination with --query; then you don't need the grep and cut suggested by https://stackoverflow.com/a/55485500/1080523.
Example:
> az account list --query '[].state' --output tsv
Enabled
Enabled
Enabled
Enabled
Enabled
Make sure to run Add-WebConfiguration after the web.config is deployed.
Reasoning: Add-WebConfiguration saves its changes in the web.config file. I assume those changes were immediately overwritten when the web.config file was replaced (deployed).
I tried these ways, but I have this problem too:
WARNING: Skipping TensorFlow as it is not installed.
The mistake I made was uploading the p12 of my Apple development and distribution certificates. Instead, I should have downloaded the APNs certificate from developer.apple.com, installed it on my PC, and then extracted the p12 from it. If anyone is making the same mistake, you might find this helpful.
Once a session exceeds the model's token limit, the oldest messages get trimmed out so the model can focus on the recent ones; the model effectively forgets the earliest parts of the conversation to make room for newer messages. - by ChatGPT itself
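That trimming behaves like a simple sliding window, which can be sketched as follows (count_tokens here is a character-count stand-in for a real tokenizer, and the whole function is an illustration, not the provider's actual implementation):

```python
def trim_history(messages, max_tokens, count_tokens=len):
    """Drop the oldest messages until the history fits the token budget."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > max_tokens:
        kept.pop(0)  # the earliest message is forgotten first
    return kept
```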
Because it does not exist: the letter "g" does not occur in "apple".
There is not enough information to help you, but there are two main reasons a PDF may contain no fonts:
Open the project in the Google Cloud Console and navigate to APIs & Services > Credentials > Create Credentials > API key. You can copy the API key from the pop-up. That's all.
$('#select').select2({
templateResult: formatOption,
templateSelection: formatOption
});
function formatOption(option) {
if (!option.id) return option.text;
return $(`<span><img src="${option.element.dataset.img}" style="width: 20px;"/> ${option.text}</span>`);
}
<select id="select">
<option data-img="avatar1.jpg">John</option>
<option data-img="avatar2.jpg">Jane</option>
</select>
Based on @John Bollinger's answer, plus some googling about colorizing error output from make, I stumbled across the following solution in the makefile documentation:
ifdef REVISION
$(info "REVISION is ${REVISION}")
ERR = :
else
ERR = $(error "REVISION is unspecified...")
endif

all: err <some other prerequisites>

.PHONY: err
err: ; $(ERR)

.PHONY: clear
clear:
	rm -r build
As I understand it, the first prerequisite, err, of the all target gets built first; at that point, the ERR variable is expanded and executed in the recipe.
However, this solution might not solve what @John Bollinger pointed out and that is if someone tries to execute an intermediate file or so.
However, in my case the revision propagates from the makefile into the source code via CFLAGS, and if it's not defined, the code doesn't compile, as the code itself checks for the revision. So both solutions are acceptable for me. :-)
The way the query is written, it will "detoast" many times the same JSON.
Here are some explanations and a workaround:
https://dev.to/mongodb/jsonb-detoasting-read-amplification-4ikj
"Kay Plaisir – We recharge phones! Come get your service! Starting in July, we are open every day from 8 AM to 9 PM. Call 4082-2549 or 3110-0392. Don't forget: Johnsly_709 is there for you!"
As others have said, there is no single solution. I have written a Julia library (HiddenFiles.jl) which attempts to be complete, but there are a lot of edge cases (especially for macOS, with different types of hidden files and constantly changing APIs). More information about the functionality of this algorithm can be found here.
I also have the same problem. Based on this link https://developers.facebook.com/docs/marketing-api/reference/ads-action-stats/, Meta provides some parameters for developers to pull the appointments-scheduled data. I tried schedule_total and schedule_website, since the ad campaign is based on an external website/landing page, but neither of them works. It's been a year now, so perhaps you have found the answer. I would be very grateful if you were willing to share it with the rest of us.
Yes, I ran into this problem too. My __consumer_offsets topic had a replication factor of 1, so I changed it to 3 (3 is my broker count). Then I started my Kafka cluster and killed one broker, went to Prometheus, and looked at my kafka-exporter: its status was up. So the problem was solved.
Just use fbprophet if you only want to finish the task and don't need any deeper working knowledge.
I have created a DataSnap server that accesses various databases. You can have pooled connections to the databases. I personally use Devart SDAC components to access the databases, but I think pooled connections should also work with FireDAC.
In each DataSnap method that accesses a database, I instantiate a connection to the database and free it at the end of the procedure. In this scenario, with pools activated, the number of real connections to the DB will NOT be huge. See https://docwiki.embarcadero.com/CodeExamples/Sydney/en/FireDAC.Pooling_Sample
Did you perhaps forget to install Ninja, or to add the directory containing the ninja executable to your Linux Mint environment PATH?
If the app is (to be) written in react native, react-native-stockfish-android library can be used.
Having said that, it was made to work with Stockfish version 15, and at the time of this posting the most recent version is 17, so I'm not sure how forward-compatible it is.
Setting dir=auto globally across all websites, like YouTube, isn't as straightforward as you might hope, because browsers are built to display pages the way their developers intended, following web standards to keep everything consistent. There are no built-in settings to change HTML attributes everywhere, due to concerns about security, performance, and how well it would work on complex sites. If browsers let you make such broad changes, it could slow things down or cause odd issues, especially on sites with dynamic content or intricate designs.

But if you're comfortable with a bit of tech tinkering, browser extensions or user scripts can be a handy workaround. Tools like Tampermonkey or Greasemonkey allow you to run custom JavaScript on web pages, so you can set dir=auto on text elements where it makes sense. These scripts work on a per-page basis, giving you the freedom to target specific parts without messing up the whole site.
You can try installing the locales-all package.
I think the issue might be due to how venv and pip handle temporary files on Windows. Personally, I’d recommend switching to Conda instead — it’s more stable for installing PyTorch on Windows and avoids many of these file-locking or permission issues.
http-proxy-middleware version issue
Thanks for sharing that custom component example, Milan. I'm also working with react-big-calendar and needed a way to reorder the event display in Day View. Your MyEvent component approach makes sense and looks easy to implement. Just curious, do you know if there's a way to customize the tooltip display too when using a custom event component like this? I'd like to either disable it completely or show different content there.
To display only the most recently added record in the gallery, set the form's properties as shown below.
SubmitForm(Form1);Set(varLastRecord,Form1.LastSubmit.OrderID);ResetForm(Form1)
and then set the items property of gallery as
Filter(DropDownDemos,OrderID=varLastRecord)
Try changing your compiler to mingw64 or mingw32. Why? Because curl was compiled using the MinGW compiler.
Can you correct these lines?
"#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:16
#EXTINF:15.035711,
D:\Movie\.4B30FB2AF40B47787ADACE7B733B52B5.m3u8_0/0.ts
#EXTINF:2.001711,
D:\Movie\.4B30FB2AF40B47787ADACE7B733B52B5.m3u8_0/1.ts
#EXTINF:3.002711,
D:\Movie\.4B30FB2AF40B47787ADACE7B733B52B5.m3u8_0/2.ts
#EXTINF:9.050711,
D:\Movie\.4B30FB2AF40B47787ADACE7B733B52B5.m3u8_0/3.ts
#EXTINF:4.003711,
For me, simply using psql -l was not working initially, so I had to
sudo su postgres
and then run
psql -c '\l'