Well, for what it's worth, I have put something together that I hope will work tomorrow when I have live trade data.
I'm new to Python and would appreciate feedback on the code below. I'm sure there are plenty of optimizations to be had; pandas and DataFrames are subjects I don't yet have a good handle on.
import yfinance as yf
import pandas as pd
from concurrent.futures import ThreadPoolExecutor, as_completed
import time
import datetime
import pytz
def fetch_stock_hist_data(ticker):
    try:
        stock = yf.Ticker(ticker)
        hist_data = stock.history(period="1d", interval='1m')
        # Add symbol column
        hist_data.insert(0, 'Symbol', ticker)
        # Only use the last 5 results
        hist_data = hist_data.tail()
        hist_data.reset_index(drop=True, inplace=True)
        one_min_vol = 0
        two_min_vol = 0
        five_min_vol = 0
        # Iterate using range; rows 0-4 are the last five 1-minute bars
        for i in range(len(hist_data)):
            five_min_vol += hist_data.iloc[i].to_dict().get('Volume')
            if i > 2:
                two_min_vol += hist_data.iloc[i].to_dict().get('Volume')
            if i > 3:
                one_min_vol += hist_data.iloc[i].to_dict().get('Volume')
        hist_last_row = hist_data.iloc[[-1]]
        new_df = pd.DataFrame(hist_last_row)
        drop_columns = ['Open', 'High', 'Low', 'Close', 'Volume', 'Dividends', 'Stock Splits']
        new_df = new_df.drop(columns=drop_columns)
        # Add columns for 1, 2 and 5 minute volumes
        new_df.insert(1, 'Lst1MinVol', one_min_vol)
        new_df.insert(2, 'Lst2MinVol', two_min_vol)
        new_df.insert(3, 'Lst5MinVol', five_min_vol)
        return new_df
    except Exception as e:
        print(f"Error fetching data for {ticker}: {e}")
        return None
def fetch_curr_stock_data(ticker):
    info = yf.Tickers(ticker).tickers[ticker].info
    data = [ticker, f"{info['currentPrice']}", f"{info['volume']}"]
    return data
def fetch_multiple_stocks(tickers):
    with ThreadPoolExecutor() as executor:
        futures = [executor.submit(fetch_stock_hist_data, ticker) for ticker in tickers]
        results = []
        for future in as_completed(futures):
            result = future.result()
            if result is not None:
                results.append(result)
        return pd.concat(results)
def fetch_curr_stocks(tickers):
    table_title = ['Symbol', 'Price', 'TotVolume']
    prevVol_df = pd.DataFrame(columns=['Symbol', 'PrevVolume'])
    with ThreadPoolExecutor() as executor:
        while True:
            df = pd.DataFrame(columns=table_title)
            results = list(executor.map(fetch_curr_stock_data, tickers))
            # Add items from results
            for result in results:
                df.loc[len(df)] = result
            # Convert TotVolume from string to number
            df['TotVolume'] = pd.to_numeric(df['TotVolume'], errors='coerce')
            # Copy volume data for each symbol to a new df
            prevVol_df = df[['Symbol', 'TotVolume']].copy()
            prevVol_df.rename(columns={'TotVolume': 'PrevVolume'}, inplace=True)
            # Create a new df by merging df and prevVol_df
            tmp_df = pd.merge(df, prevVol_df, on='Symbol', how='left')
            curr_volume = tmp_df['TotVolume'].astype(int) - tmp_df['PrevVolume'].astype(int)
            tmp_df.insert(2, 'CurrVol', curr_volume)
            return tmp_df
if __name__ == "__main__":
    new_york_tz = pytz.timezone('America/New_York')
    tickers = ["AAPL", "GOOG", "MSFT"]
    # tickers = ["AAPL"]
    while True:
        # Get current time and format as 09:30:00
        time_object = datetime.datetime.now(new_york_tz)
        curr_time = time_object.strftime('%H:%M:%S')
        # Get stock info for tickers
        df_curr = fetch_curr_stocks(tickers)
        # Get stock historical data for the last 5 minutes today
        df_hist = fetch_multiple_stocks(tickers)
        #########################
        # Merge df_curr and df_hist on 'Symbol' to ensure data integrity
        merged_df = pd.merge(df_curr, df_hist[['Symbol', 'Lst1MinVol', 'Lst2MinVol', 'Lst5MinVol']], on='Symbol', how='left')
        #########################
        # Clean up dataframe data
        new_order = ['Symbol', 'Price', 'CurrVol', 'Lst1MinVol', 'Lst2MinVol', 'Lst5MinVol', 'TotVolume', 'PrevVolume']
        final_df = merged_df[new_order]
        # Get rid of the 'PrevVolume' column
        final_df = final_df.drop(final_df.columns[-1], axis=1)
        # Insert time stamp as a new column
        final_df.insert(1, 'Time', curr_time)
        # Write data to csv file
        final_df.to_csv('/tmp/yf_data.csv', mode='a', header=False, index=False)
        # Output:
        # AAPL,22:01:16,222.91,0,979377,1403850,2299514,63519990
        # GOOG,22:01:16,172.65,0,387421,727605,1237449,21385165
        # MSFT,22:01:16,410.37,0,432180,558932,861389,23745361
        # Format volume data with thousand separators for readability when printing to screen
        final_df['CurrVol'] = final_df['CurrVol'].apply(lambda x: f"{x:,}")
        final_df['Lst1MinVol'] = final_df['Lst1MinVol'].apply(lambda x: f"{x:,}")
        final_df['Lst2MinVol'] = final_df['Lst2MinVol'].apply(lambda x: f"{x:,}")
        final_df['Lst5MinVol'] = final_df['Lst5MinVol'].apply(lambda x: f"{x:,}")
        final_df['TotVolume'] = final_df['TotVolume'].apply(lambda x: f"{x:,}")
        print(final_df)
        # Output:
        #   Symbol      Time   Price CurrVol Lst1MinVol Lst2MinVol Lst5MinVol   TotVolume
        # 0   AAPL  22:06:38  222.91       0    979,377  1,403,850  2,299,514  63,519,990
        # 1   GOOG  22:06:38  172.65       0    387,421    727,605  1,237,449  21,385,165
        # 2   MSFT  22:06:38  410.37       0    432,180    558,932    861,389  23,745,361
        time.sleep(10)
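As a side note, the per-row loop in fetch_stock_hist_data can be replaced with vectorized pandas sums. This is a minimal sketch using a toy Volume series standing in for the last five 1-minute bars (the column name matches yfinance's output; the numbers are made up):

```python
import pandas as pd

# Toy stand-in for the last five 1-minute bars from yf.Ticker().history()
hist_data = pd.DataFrame({"Volume": [100, 200, 300, 400, 500]})

vol = hist_data["Volume"].tail(5)
five_min_vol = int(vol.sum())          # sum of all 5 bars
two_min_vol = int(vol.tail(2).sum())   # sum of the last 2 bars
one_min_vol = int(vol.tail(1).sum())   # the last bar only

print(one_min_vol, two_min_vol, five_min_vol)  # 500 900 1500
```

Besides being shorter, this avoids the per-row .iloc[i].to_dict() calls, which are comparatively slow on DataFrames.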
To fix this, I added the @mutable property to the Network Description (.ned) file of LinearMobility under inet/mobility/single/. To change a parameter during runtime, it must be declared mutable; mutable parameters can be set to a different value while the simulation is running.
I can reproduce the same error when I try to add a managed identity (MI) by its Object ID.
If you hit the same error, add it to Azure DevOps by its Name or Application ID instead:
Go to your tenant's Enterprise applications page, change Application type to All Applications, and search for your target MI by its Object ID.
Then add it to Azure DevOps by Name or Application ID.
CREATE TABLE Persons (
    PersonID int,
    LastName varchar(255),
    FirstName varchar(255),
    Address varchar(255),
    City varchar(255)
);
Is there a solution to this problem now? I encountered this problem before and it was resolved, but I ran into it again and the earlier fix no longer worked well. You can refer to this article for details and try it out:
https://blog.csdn.net/m0_66975650/article/details/143039495?spm=1001.2014.3001.5501
I hope this is helpful. I got stuck on the same problem, but with the help of ChatGPT and Claude AI I was able to arrive at one possible solution.
I am using localhost in this example, with Tailwind CSS in a MERN stack project.
-------------------------------Passport Setup--------------------------------
import passport from "passport";
import { Strategy as GoogleStrategy } from "passport-google-oauth20";
import User from "../models/user.model.js";
import dotenv from "dotenv";
dotenv.config();
// Configure Passport with a Google strategy for authentication
passport.use(
  "google",
  new GoogleStrategy(
    {
      clientID: process.env.GOOGLE_CLIENT_ID,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET,
      callbackURL: "/api/auth/google/callback",
    },
    /**
     * Verify the user's credentials using Google.
     *
     * This function is called by Passport when a user attempts to log in with their Google account.
     * It:
     * 1. Searches for a user with the provided Google ID.
     * 2. If no user is found, it creates a new user with information from the Google profile.
     * 3. Returns the user object.
     * 4. Passes any errors to the `done` callback.
     *
     * @param {string} accessToken - The access token provided by Google.
     * @param {string} refreshToken - The refresh token provided by Google.
     * @param {Object} profile - The user's profile information from Google.
     * @param {Function} done - The callback to call with the authentication result.
     */
    async (accessToken, refreshToken, profile, done) => {
      try {
        let user = await User.findOne({ googleId: profile.id });
        // Additional check to prevent duplicate accounts if the Google email changes
        if (!user) {
          user = await User.findOne({ email: profile._json.email });
        }
        if (!user) {
          // Generate a random password
          const randomPassword = User.prototype.generateRandomPassword();
          // Create a new user
          user = await User.create({
            googleId: profile.id,
            name: profile._json.name,
            email: profile._json.email,
            password: randomPassword, // Set the generated password
            profilePicture: profile._json.picture,
          });
        }
        return done(null, user);
      } catch (error) {
        return done(error, false);
      }
    }
  )
);
/**
 * Serialize the user for the session.
 *
 * This function is called when a user is authenticated. It:
 * 1. Takes the user object and stores the user ID in the session.
 * 2. This ID is used to identify the user in subsequent requests.
 *
 * @param {Object} user - The authenticated user object.
 * @param {Function} done - The callback to call with the serialized user ID.
 */
passport.serializeUser((user, done) => {
  done(null, user.id);
});
/**
 * Deserialize the user from the session.
 *
 * This function is called on each request to retrieve the user object based on the user ID stored in the session. It:
 * 1. Finds the user by their ID.
 * 2. Passes the user object to the `done` callback.
 * 3. Passes any errors to the `done` callback if the user cannot be found.
 *
 * @param {string} id - The user ID stored in the session.
 * @param {Function} done - The callback to call with the user object or an error.
 */
passport.deserializeUser(async (id, done) => {
  try {
    const user = await User.findById(id);
    done(null, user);
  } catch (err) {
    done(err);
  }
});
export default passport;
------------------------------- Auth Controller --------------------------------
import passport from "../lib/PassportSetup.js";
import User from "../models/user.model.js";
/**
 * Initiates Google authentication.
 *
 * This function handles initiating the Google OAuth2 authentication process by:
 * 1. Redirecting the user to Google's OAuth2 login page.
 *
 * @param {Object} req - The request object for initiating Google authentication.
 * @param {Object} res - The response object to redirect the user to Google.
 * @param {Function} next - The next middleware function in the stack.
 */
export const googleAuth = passport.authenticate("google", {
  scope: ["profile", "email"],
});
/**
 * Handles the callback from Google OAuth2.
 *
 * This function handles the callback after the user has authenticated with Google. It:
 * 1. Uses Passport's 'google' strategy to authenticate the user.
 * 2. Redirects the user to the home page on successful authentication.
 * 3. Handles authentication errors by redirecting to the login page with an error message.
 *
 * @param {Object} req - The request object containing Google OAuth2 callback data.
 * @param {Object} res - The response object to redirect the user.
 * @param {Function} next - The next middleware function in the stack.
 */
export const googleAuthCallback = (req, res, next) => {
  passport.authenticate("google", {
    successRedirect: `${process.env.CLIENT_URL}/oauth/callback`,
    failureRedirect: `${process.env.CLIENT_URL}/login`,
    failureFlash: true,
  })(req, res, next);
};
/**
 * Handles successful authentication callbacks from OAuth providers.
 *
 * This function is triggered when a user is successfully authenticated via an OAuth provider (e.g., Google, GitHub).
 * It:
 * 1. Checks if a user object is present on the request, which is set by Passport after successful authentication.
 * 2. Responds with a 200 status and user information if authentication is successful.
 * 3. Includes the user's ID, name, email, profile picture, and role in the response.
 *
 * @param {Object} req - The request object, containing authenticated user data.
 * @param {Object} res - The response object used to send back the authentication result.
 * @param {Function} next - The next middleware function in the stack (not used in this function).
 * @returns {Object} JSON object with user data on success, or an error status if authentication fails.
 */
export const authCallbackSuccess = (req, res, next) => {
  return res.status(200).json({
    success: true,
    status: 200,
    user: {
      id: req.user.id,
      name: req.user.name,
      email: req.user.email,
      profilePicture: req.user.profilePicture,
      role: req.user.role,
    },
  });
};
------------------------------- Auth Routes --------------------------------
import express from "express";
import {
  googleAuth,
  googleAuthCallback,
  authCallbackSuccess,
} from "../controllers/auth.controller.js";
// NOTE: isAuthenticated is used below but was missing an import; adjust this path to wherever your auth middleware actually lives
import { isAuthenticated } from "../middlewares/auth.middleware.js";
const router = express.Router();
// Passport Google OAuth2 login
router.get("/google", googleAuth);
// Handles the Passport Google OAuth2 callback
router.get("/google/callback", googleAuthCallback);
// Returns the user object after the Passport Google OAuth2, GitHub, or any other callback
router.get("/callback/success", isAuthenticated, authCallbackSuccess);
export default router;
------------------------ React OAuthButtons.jsx -------------------------
import React from 'react';
import { useSelector } from 'react-redux';
function OAuthButtons() {
  const { loading } = useSelector((state) => state.user);
  const handleOAuth = (provider) => {
    window.location.href = `http://localhost:4000/api/auth/${provider}`;
  };
  return (
    <div className='flex flex-col gap-3'>
      <button
        className="bg-red-700 text-white rounded-lg p-3 uppercase hover:bg-red-600 disabled:bg-red-400"
        type="button"
        onClick={() => handleOAuth("google")}
        disabled={loading}
      >
        Continue with Google
      </button>
      <button
        className="bg-blue-700 text-white rounded-lg p-3 uppercase hover:bg-blue-600 disabled:bg-blue-400"
        type="button"
        onClick={() => handleOAuth("github")}
        disabled={loading}
      >
        Continue with Github
      </button>
    </div>
  );
}
export default OAuthButtons;
------------------------ React OAuthCallback.jsx -------------------------
import React, { useEffect } from 'react';
import axios from 'axios';
import { useDispatch } from 'react-redux';
import { useNavigate } from 'react-router-dom';
import { loginStart, loginSuccess, loginFailure } from '../../redux/user/userSlice.js';
function OAuthCallback() {
  const dispatch = useDispatch();
  const navigate = useNavigate();
  useEffect(() => {
    const handleCallback = async () => {
      try {
        dispatch(loginStart());
        const response = await axios.get(
          `http://localhost:4000/api/auth/callback/success`,
          { withCredentials: true }
        );
        dispatch(loginSuccess({ user: response.data.user }));
        navigate('/');
      } catch (error) {
        dispatch(loginFailure({
          error: error.response?.data?.message || "Login using Google failed! Please try using email and password!"
        }));
        navigate('/login');
      }
    };
    handleCallback();
  }, [dispatch, navigate]);
  return (
    <div className="flex items-center justify-center min-h-screen">
      <div className="animate-spin rounded-full h-12 w-12 border-t-2 border-b-2 border-red-700"></div>
    </div>
  );
}
export default OAuthCallback;
------------------------ React App.jsx -------------------------
import React from 'react';
import { BrowserRouter, Routes, Route } from "react-router-dom";
import NavigationBar from './components/Navigation/NavigationBar.jsx';
import Home from './pages/Static/Home.jsx';
import About from './pages/Static/About.jsx';
import Register from './pages/Auth/Register.jsx';
import Login from './pages/Auth/Login.jsx';
import OAuthCallback from './components/Auth/OAuthCallback.jsx';
function App() {
  return (
    <BrowserRouter>
      <NavigationBar />
      <Routes>
        <Route path='/' element={<Home />} />
        <Route path='/about' element={<About />} />
        <Route path='/register' element={<Register />} />
        <Route path='/login' element={<Login />} />
        <Route path="/oauth/callback" element={<OAuthCallback />} />
      </Routes>
    </BrowserRouter>
  );
}
export default App;
------------------------ React (Sample Implementation) Login.jsx -------------------------
import React, { useState } from 'react';
import { Link, useNavigate } from 'react-router-dom';
import axios from "axios";
import { useDispatch, useSelector } from 'react-redux';
import { loginStart, loginSuccess, loginFailure } from '../../redux/user/userSlice.js';
import OAuthButtons from '../../components/Auth/OAuthButtons.jsx';
function Login() {
  const [formData, setFormData] = useState({
    email: "",
    password: "",
  });
  const navigate = useNavigate();
  const dispatch = useDispatch();
  const { loading, error } = useSelector((state) => state.user);
  const handleChange = (e) => {
    setFormData({ ...formData, [e.target.id]: e.target.value });
  };
  const handleSubmit = async (e) => {
    e.preventDefault();
    try {
      dispatch(loginStart());
      const response = await axios.post("http://localhost:4000/api/auth/login", formData, { withCredentials: true });
      const data = response.data;
      dispatch(loginSuccess({ user: data.user }));
      navigate("/profile");
    } catch (error) {
      dispatch(loginFailure({ error: error.response?.data?.message || "An unexpected error occurred. Please try again." }));
    }
  };
  return (
    <div className='p-3 max-w-lg mx-auto'>
      <h1 className='text-3xl text-center font-semibold my-7'>Login</h1>
      <form onSubmit={handleSubmit} className='flex flex-col gap-4'>
        <input type="email" placeholder='Email' id='email' className='bg-slate-100 p-3 rounded-lg' onChange={handleChange} />
        <input type="password" placeholder='Password' id='password' className='bg-slate-100 p-3 rounded-lg' onChange={handleChange} />
        <button type='submit' disabled={loading} className='bg-slate-700 text-white p-3 rounded-lg uppercase hover:opacity-95 disabled:opacity-75 cursor-pointer'>
          {loading ? "Loading..." : "Login"}
        </button>
        <div className='border-b'></div>
        <OAuthButtons />
      </form>
      <div className='flex gap-2 mt-5'>
        <p>Don't have an account?</p>
        <span className='text-blue-500'><Link to={"/register"}>Register</Link></span>
      </div>
      <div>
        <p className='text-red-700'>{error}</p>
      </div>
    </div>
  );
}
export default Login;
The Zendesk database is distributed, so there are no ACID guarantees. As noted here, it can take a few minutes for Zendesk Support to index new and modified tickets: Zendesk Support search reference
More explanation in this help center article: Search API Delayed Results vs Users API
Run the following command:
flutter config --jdk-dir <path_to_jdk>
If you rerun flutter doctor --verbose after this, you should see the JDK path updated.
Use the dynamic import() syntax, which is function-like and allows you to conditionally import modules at runtime. Here's how to implement it:
import { mainURL } from '../../support/helperFunctions.js';
let dataModule;
if (mainURL.includes('dev')) {
  dataModule = import('../../../data/dataDev.js');
} else {
  dataModule = import('../../../data/data.js');
}
// Use an async function to handle the dynamic import and access the module's exports
async function loadData() {
  const { coreData, lastUpdated } = await dataModule;
  console.log(coreData, lastUpdated);
  // You can now use coreData and lastUpdated as needed
}
// Call loadData() to trigger the import
loadData();
import() returns a promise, so we can use await to get the module's exports. Because import() is asynchronous, you need to wrap it in an async function (or an async IIFE) to handle the promise:
(async () => {
  if (somethingIsTrue) {
    // import module for side effects
    await import("/modules/my-module.js");
  }
})();
Refer to this article: Dynamic Imports MDN
Friend,
Could you please share the exact steps you followed, in detail? Somehow I am unable to reproduce a similar setup. I am stuck on a failed Keycloak page stating "error=client_not_found, reason=Cannot_match_source_hash".
Zendesk is discontinuing URL and branded targets in favor of webhooks.
Take a look at the webhooks documentation here: Zendesk Webhooks
When working with a Microsoft Access database that is linked to SharePoint lists, you may encounter issues when trying to write to the database using Java, particularly with the UCanAccess library. The error messages you are seeing suggest that the linked SharePoint lists are causing the database to behave in a read-only manner.
Here are some potential solutions to address this issue:
Re-create the linked tables:
1. Link to the SharePoint list directly to ensure all lookup tables are present in Access.
2. Delete the linked table for the SharePoint list.
3. Re-link to the SharePoint view using the ImportSharePointList macro action.
Disable caching:
1. Go to File > Options > Current Database.
2. In the Caching Web Service area, select the "Never Cache" checkbox for Microsoft SharePoint tables.
Check Permissions: Ensure that you have the necessary permissions to write to the SharePoint lists. If your permissions are set to read-only, you will not be able to perform write operations.
Connection String: Verify that the connection string you are using in your Java application is correctly formatted and points to the right SharePoint list. Any errors in the connection string can lead to the warnings you are seeing.
Use the Latest Version of UCanAccess: Make sure you are using the latest version of the UCanAccess library, as updates may include bug fixes and improvements for working with SharePoint.
By addressing these areas, you should be able to resolve the issues you are facing when trying to write to an Access database linked to SharePoint lists.
If you are using Angular 18, go inside main.ts and make the following changes:
bootstrapApplication(AppComponent, {
  providers: [
    provideHttpClient(withInterceptors([exampleInterceptorInterceptor])),
  ],
}).catch((err) => console.error(err));
Rolling back the extension to v14.1 will mitigate the issue until a fix is released.
As @jitter answered, if using kind = 'BlobStorage', the accessTier in properties is required. reference
resource storageAccount1 'Microsoft.Storage/storageAccounts@2023-05-01' = {
  name: 'satest111'
  location: 'westus'
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'BlobStorage'
  properties: {
    accessTier: 'Cold'
  }
}
Try using kind = 'StorageV2' to keep things simple if the BlobStorage kind is not mandatory; the v2 type is also the recommended kind.
resource storageAccount2 'Microsoft.Storage/storageAccounts@2023-05-01' = {
  name: 'satest222'
  location: 'westus'
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
Here is a sample reference.
Once you have the body set to 100% and the main set to 100%, you can try using a negative margin to bring it in. I know it sounds like a cheap trick, and honestly I am not the best developer as I am rather new, but I had a Bootstrap project that for some reason had the same issue, and using a negative margin was the only thing that worked for me.
app.js
import mysql from 'mysql';
import express from 'express';
import bodyParser from 'body-parser';
import session from 'express-session';
const app = express();
// Middleware setup
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.static('public'));
app.use(session({ secret: 'your-secret-key', resave: false, saveUninitialized: false }));
app.set('view engine', 'ejs');
// Create a MySQL connection pool
const con = mysql.createPool({
  host: "localhost",
  user: "root",
  password: "",
  database: "ecommerce_db",
});
// Utility function to execute database queries
function queryDatabase(query, params) {
  return new Promise((resolve, reject) => {
    con.query(query, params, (error, results) => {
      if (error) {
        reject(error);
      } else {
        resolve(results);
      }
    });
  });
}
// Utility function to calculate the total and prepare cart data
function calculateCartDetails(cart, productResults) {
  const productMap = {};
  productResults.forEach(product => {
    productMap[product.id] = product;
  });
  let total = 0;
  cart.forEach(item => {
    const product = productMap[item.productId];
    if (product) {
      total += product.price * item.quantity; // Accumulate the line total
    }
  });
  return { productMap, total };
}
// Product Routes
app.get('/', async (req, res) => {
  try {
    const products = await queryDatabase('SELECT * FROM products', []);
    res.render('pages/product', { products });
  } catch (error) {
    console.error('Database query error:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Cart Routes
app.get('/cart', async (req, res) => {
  const cart = req.session.cart || []; // Get cart from session
  if (cart.length === 0) {
    return res.render('pages/cart', { cart, total: 0 });
  }
  const productIds = cart.map(item => item.productId);
  const placeholders = productIds.map(() => '?').join(',');
  try {
    const productResults = await queryDatabase(`SELECT id, name, price FROM products WHERE id IN (${placeholders})`, productIds);
    const { productMap, total } = calculateCartDetails(cart, productResults);
    res.render('pages/cart', { cart, productMap, total });
  } catch (error) {
    console.error('Database query error:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Add to Cart (POST)
app.post('/cart/add', (req, res) => {
  const productId = req.body.productId;
  const quantity = parseInt(req.body.quantity, 10) || 1;
  // Initialize cart if it doesn't exist
  if (!req.session.cart) {
    req.session.cart = [];
  }
  // Check if the product is already in the cart
  const existingProduct = req.session.cart.find(item => item.productId === productId);
  if (existingProduct) {
    existingProduct.quantity += quantity; // Update quantity
  } else {
    req.session.cart.push({ productId, quantity }); // Add new product
  }
  res.redirect('/cart'); // Redirect to cart
});
// Remove from Cart (POST)
app.post('/cart/remove', (req, res) => {
  const productId = req.body.productId;
  // Filter out the product to remove
  req.session.cart = req.session.cart.filter(item => item.productId !== productId);
  res.redirect('/cart'); // Redirect to cart
});
// Payment Routes
app.get('/payment', async (req, res) => {
  const cart = req.session.cart || [];
  if (cart.length === 0) {
    return res.redirect('/'); // Redirect to products if cart is empty
  }
  const productIds = cart.map(item => item.productId);
  const placeholders = productIds.map(() => '?').join(',');
  try {
    const productResults = await queryDatabase(`SELECT id, name, price FROM products WHERE id IN (${placeholders})`, productIds);
    const { productMap, total } = calculateCartDetails(cart, productResults);
    res.render('pages/payment', { cart, productMap, total });
  } catch (error) {
    console.error('Database query error:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Payment Route (POST)
app.post('/payment', async (req, res) => {
  const { orderId, total } = req.body;
  try {
    // Render the payment page with order details
    res.render('pages/payment', { orderId, total });
  } catch (error) {
    console.error('Payment processing error:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Checkout Route (POST)
app.post('/checkout', async (req, res) => {
  const { orderId, paymentMethod } = req.body;
  try {
    // Update order status to "paid" and save the payment method
    await queryDatabase('UPDATE orders SET status = ?, payment_method = ? WHERE id = ?', ['paid', paymentMethod, orderId]);
    // Clear the cart after payment
    req.session.cart = [];
    // Render a success page or send a confirmation message
    res.render('pages/checkout', { orderId, paymentMethod });
  } catch (error) {
    console.error('Error updating payment status:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Order Route
app.get('/order', async (req, res) => {
  const cart = req.session.cart || [];
  if (cart.length === 0) {
    return res.redirect('/'); // Redirect to products if cart is empty
  }
  const productIds = cart.map(item => item.productId);
  const placeholders = productIds.map(() => '?').join(',');
  try {
    // Retrieve product details for the cart items
    const productResults = await queryDatabase(
      `SELECT id, name, price FROM products WHERE id IN (${placeholders})`,
      productIds
    );
    const { productMap, total } = calculateCartDetails(cart, productResults);
    // Insert order into the database with status "not paid"
    const orderResult = await queryDatabase(
      'INSERT INTO orders (status, total) VALUES (?, ?)',
      ['not paid', total]
    );
    const orderId = orderResult.insertId;
    // Insert each cart item as an order item linked to the order ID
    await Promise.all(
      cart.map(item =>
        queryDatabase(
          'INSERT INTO order_items (order_id, product_id, quantity, price) VALUES (?, ?, ?, ?)',
          [orderId, item.productId, item.quantity, productMap[item.productId].price]
        )
      )
    );
    // Clear the cart after placing the order
    req.session.cart = [];
    // Render confirmation page with order details
    res.render('pages/order_confirmation', { orderId, total });
  } catch (error) {
    console.error('Database query error:', error);
    return res.status(500).send('Internal Server Error');
  }
});
// Start the server (kept last so all routes are registered first)
app.listen(3001, () => {
  console.log('Server running on http://localhost:3001');
});
cart.ejs
<body>
  <h1>Your Cart</h1>
  <ul>
    <% cart.forEach(item => { %>
      <li>
        Product: <%= productMap[item.productId].name %> <br>
        Price: $<%= productMap[item.productId].price.toFixed(2) %> <br>
        Quantity: <%= item.quantity %> <br>
        Total: $<%= (productMap[item.productId].price * item.quantity).toFixed(2) %><br>
        <form action="/cart/remove" method="POST" style="display:inline;">
          <input type="hidden" name="productId" value="<%= item.productId %>">
          <button type="submit">Remove</button>
        </form>
      </li>
    <% }) %>
  </ul>
  <h2>Total Price: $<%= total.toFixed(2) %></h2>
  <a href="/order">Place Order</a>
  <a href="/">Continue Shopping</a>
</body>
</html>
checkout.ejs
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Payment Success</title>
</head>
<body>
  <h1>Payment Successful!</h1>
  <p>Thank you for your payment for Order #<%= orderId %>.</p>
  <p>Your order status is now: Paid</p>
  <a href="/">Return to Home</a>
</body>
</html>
order_confirmation.ejs
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Order Confirmation</title>
</head>
<body>
  <h1>Order Confirmation</h1>
  <p>Thank you for your order!</p>
  <p>Order ID: <%= orderId %></p>
  <p>Total Amount: $<%= total.toFixed(2) %></p>
  <p>Status: Not Paid</p>
  <!-- Payment Button -->
  <form action="/payment" method="POST">
    <input type="hidden" name="orderId" value="<%= orderId %>">
    <input type="hidden" name="total" value="<%= total %>">
    <button type="submit">Proceed to Payment</button>
  </form>
</body>
</html>
payment.ejs
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Payment</title>
</head>
<body>
  <h1>Payment for Order #<%= orderId %></h1>
  <p>Total Amount: $<%= parseFloat(total).toFixed(2) %></p>
  <!-- Payment Form with Payment Method Selection -->
  <form action="/checkout" method="POST">
    <input type="hidden" name="orderId" value="<%= orderId %>">
    <h3>Select Payment Method:</h3>
    <label>
      <input type="radio" name="paymentMethod" value="Credit Card" required> Credit Card
    </label><br>
    <label>
      <input type="radio" name="paymentMethod" value="PayPal" required> PayPal
    </label><br>
    <label>
      <input type="radio" name="paymentMethod" value="Bank Transfer" required> Bank Transfer
    </label><br>
    <button type="submit">Complete Payment</button>
  </form>
</body>
</html>
product.ejs
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Products</title>
</head>
<body>
<header>
<%- include('../partials/header') %>
</header>
<main>
<h1>Products</h1>
<ul>
<% products.forEach(product => { %>
<li>
<h2><%= product.name %></h2>
<p>Price: $<%= product.price %></p>
<p>Description: <%= product.description %></p>
<img src="<%= product.image %>" alt="<%= product.name %>" />
<form action="/cart/add" method="POST">
<input type="hidden" name="productId" value="<%= product.id %>">
<input type="number" name="quantity" value="1" min="1">
<button type="submit">Add to Cart</button>
</form>
</li>
<% }) %>
</ul>
</main>
<footer>
<%- include('../partials/footer') %>
</footer>
</body>
</html>
CREATE TABLE orders (
id INT AUTO_INCREMENT PRIMARY KEY,
status VARCHAR(20),
total DECIMAL(10, 2),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE order_items (
id INT AUTO_INCREMENT PRIMARY KEY,
order_id INT,
product_id INT,
quantity INT,
price DECIMAL(10, 2),
FOREIGN KEY (order_id) REFERENCES orders(id),
FOREIGN KEY (product_id) REFERENCES products(id)
);
ALTER TABLE orders ADD COLUMN payment_method VARCHAR(50);
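A quick way to sanity-check the schema above is to exercise it in an in-memory database. The sketch below adapts the DDL to SQLite syntax (AUTO_INCREMENT becomes AUTOINCREMENT, DECIMAL becomes REAL) and assumes a minimal products table, since only its foreign key appears in the source:

```python
import sqlite3

# In-memory sketch of the orders schema, adapted to SQLite syntax.
# The products table is an assumption; only its FK is shown above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    status TEXT,
    total REAL,
    payment_method TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE order_items (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    order_id INTEGER REFERENCES orders(id),
    product_id INTEGER REFERENCES products(id),
    quantity INTEGER,
    price REAL
);
""")
conn.execute("INSERT INTO products VALUES (1, 'Widget', 9.99)")
conn.execute("INSERT INTO orders (status, total) VALUES ('Not Paid', 19.98)")
conn.execute(
    "INSERT INTO order_items (order_id, product_id, quantity, price) "
    "VALUES (1, 1, 2, 9.99)"
)
# Recompute the order total from its line items.
total = conn.execute(
    "SELECT SUM(quantity * price) FROM order_items WHERE order_id = 1"
).fetchone()[0]
print(total)
```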
Is there a way to friend all the tests within a class?
Currently, the library google.cloud.secretmanager doesn't support importing through module-info.java. The Cloud Java SDK for Secret Manager doesn't work in a modularized project. The current known workaround is to repackage the JAR files. See thread: https://github.com/googleapis/google-cloud-java/issues/10975
Thanks for the reply, indeed the solution is to place these files with different extensions in different folders.
With com.github.skydoves:powerspinner:1.2.7, just add:
languageSpinner.setIsFocusable(true)
For the above-mentioned problem, to render conditionally I would highly suggest using conditional parallel routes (note: parallel routes don't work as expected for dynamic routes), but for a normal route they work like a charm and are part of the layout.
Also, if possible, I would suggest using server components and relying on server functions for auth-related checks and flows; this would possibly help you in the following ways.
P.S.: I understand that the child passed to a client component can be a server component.
Second, if you have different route paths for authenticated and unauthenticated users, then, as suggested by @lalit-fauzdar in his answer, use middleware to dynamically route users to the respective paths.
P.S.: It would be helpful if you could share the error shown.
The "delprioritychar" hint was beautiful, but it only works with "of DEL" and not with "of CURSOR".
on webflow, this can be solved by selecting "disable responsiveness" inside the image setting.
You might rely on the code snippet below from lekman.com showing a logic app deployment. He depends on a Bicep deploy, but its output mechanisms are compatible: access output variables by name reference.
By using a combination of tail and head you should achieve it.
Use head to get all but the last X lines of a file named file.txt:
head -n -X file.txt
It should work for .bash_history too, but don't redirect straight back into the same file, because the shell truncates it before head can read it. Go through a temporary file instead:
head -n -X .bash_history > /tmp/history.tmp && mv /tmp/history.tmp .bash_history
Use tail to keep the last Y lines of a file named file.txt:
tail -n Y file.txt
You could combine these in a script that creates two temp files and then merges them into .bash_history.
I had an issue in my dependency injection: there were two Compose instances, one being rendered and one being updated.
If you're using Socket.IO (common in NestJS), make sure you chose the "Socket.IO" option and not the basic "WebSocket" request. That was my issue.
We are seeing the same thing occasionally on our MAUI app. It only happens on Android and I believe it is related to the following open bug.
Alex from Kinde here. Apologies for the late reply!
Does Kinde store user data on their own servers?
Yes. We need to store the data to provide our customer auth as a service to you.
If so, does this mean Kinde owns the user data?
No. Your data is owned by you. Kinde is a custodian of that data on your behalf. "You own all data, information or content you and your Authorized Users upload into the Platform (Your Data), as well as any data or information output from the Platform using Your Data as input (Output Data)". This is set our in our Terms, see section 16 of https://docs.kinde.com/trust-center/agreements/terms-of-service/
Are there any potential security or privacy issues with this setup?
This is a balance that you need to decide yourself. The risk of setting up auth in house and having to maintain security updates and feature requests vs relying on a third party company who does this for a living.
Being biased on this answer, we think Kinde is an excellent choice to rely on your customer auth needs. We are constantly updating our product based on feedback so that we meet our customer's expectations, regularly go through security and compliance tests to ensure that we're as secure as possible, and have an awesome team to keep us at the top of our game. This is all done so that you can focus on building your product rather than tinkering with auth.
Hope that answers your question. Feel free to reach out to us via the Slack community or the live chat on our website.
Cheers, Alex
Any project that you want to receive changes when the starter kit is updated must be a branch of the starter kit. I would create the starter kit project, then create one branch for starter kit updates and one branch for each of the projects you use the starter kit to create. When you commit a change to the starter kit update branch, you would then need to merge it into any project branches under the starter kit that should receive the update. Make sure that only the starter kit update branch modifies the starter kit project itself.
You need to create app_name/migrations/__init__.py for every app you have, or instead you can run:
python manage.py makemigrations app_name
In my case, it was because my objectVersion = 70; I changed it to 56 and then pod init and pod install worked.
You can find objectVersion in your project.pbxproj file.
Are you looking to attach the csv output in the email? If so you can't do that. This medium article provides an alternative option. https://medium.com/@sanusa100/how-to-email-snowflake-data-as-attachments-without-attaching-files-e2844669a5a9
As you probably noticed, C and C++ are different languages.
Is it possible to make it compile when included into C++ code without making the struct type declaration un-nested?
Yes, it is:
#include <stdio.h>
struct foo {
struct bar {
int baz;
} bar;
};
int main() {
struct foo::bar a; // Use foo::bar to refer to the nested struct
a.baz = 2;
printf("%d", a.baz);
return 0;
}
The query will not look like that. PostgREST will write a CTE query for most of these. The syntax you have from the SDK isn't directly related to SQL; it's a REST API URL builder. If you want to see the query, you can look in your Postgres logs inside the Supabase dashboard. Supabase also has a tool for converting SQL to the REST API, https://supabase.com/docs/guides/api/sql-to-rest, but they don't have a tool that does the opposite.
If you look at the blue lines below the error, you can trace back to the script where you're getting this error. The error is basically telling you it cannot find the key you're looking for in that player's data. Either you mistyped the key's name (it's case-sensitive, as always) when you used the :Get, :Set, or :Update function (I don't know which one from just the one line of error you gave), or you need to just give it a few minutes (that's how it worked out for me). I don't know why you posted this on Stack Overflow; use the Roblox dev forum instead. I also don't know why I'm replying after 9 months.
The Kinde SDK is up to v1.3.1 now; are you able to retry and see if you are still seeing the issue?
Also, check that you have the crypto-js dependency installed. The SDK requires crypto-js version 3.3.0.
You can adjust the CSS in your footer class with the following style:
.footer {
color: #ffffff;
padding: 1.5rem 0;
text-align: center;
transition: background-color 0.3s ease;
height: 16%;
border: 15px solid red;
position: absolute;
bottom: 0;
}
Finally, this helped me! Xcode 15.4, and the app crashes at runtime insisting that NSCameraUsageDescription hasn't been included in info.plist, even though it has. I also didn't realise I could add a row to Info > Custom iOS Target Properties, since the + symbol used on the Build Settings tab isn't visible. Turns out that right-clicking as described above and choosing Privacy - Camera Usage adds NSCameraUsageDescription to the properties list. Bingo! Thanks
Have a look at inheritance https://www.doctrine-project.org/projects/doctrine-orm/en/3.3/reference/inheritance-mapping.html
It takes care of FKs and relations for you
Removing the surrounding whitespace first would do the trick:
" 1234 ".strip().length()
I might not have exactly what you're looking for, but you can use this to get the information at least; you can then scope it differently for your needs :)
# Retrieve all private endpoints in the subscription
$privateEndpoints = Get-AzPrivateEndpoint
# Check if private endpoints exist
if ($privateEndpoints.Count -eq 0) {
Write-Host "No private endpoints found in this subscription."
exit
}
# Loop through each private endpoint and output information
foreach ($endpoint in $privateEndpoints) {
Write-Output "Resource Group: $($endpoint.ResourceGroupName)"
Write-Output "Private Endpoint Name: $($endpoint.Name)"
# Initialize an array to store FQDNs
$fqdnList = @()
# Loop through private link service connections to build FQDNs
foreach ($connection in $endpoint.PrivateLinkServiceConnections) {
if ($connection.GroupIds -and $connection.GroupIds.Count -gt 0) {
foreach ($group in $connection.GroupIds) {
$fqdnList += "$($group).privatelink.$($connection.Name).azure.net"
}
}
}
# Display FQDNs or "None found" if empty
if ($fqdnList.Count -gt 0) {
Write-Output "FQDNs:"
foreach ($fqdn in $fqdnList) {
Write-Output " - $fqdn"
}
} else {
Write-Output "FQDNs: None found"
}
# Retrieve the private IP addresses from network interfaces
Write-Output "IP Addresses:"
$networkInterface = Get-AzNetworkInterface -ResourceId $endpoint.NetworkInterfaces.Id
# Loop through IP configurations to fetch private IP addresses
$ipAddresses = $networkInterface.IpConfigurations | ForEach-Object { $_.PrivateIpAddress }
if ($ipAddresses.Count -gt 0) {
foreach ($ip in $ipAddresses) {
Write-Output " - $ip"
}
} else {
Write-Output " - None found"
}
Write-Output "-----------------------------------------"
}
Hope this is helpful and remember shared knowledge is the best knowledge 😊 Best Regards, Timmy Malmgren
I had the same issue using MVC .NET 4.7.2, but solved it by adding this configuration to web.config:
<system.webServer>
<modules runAllManagedModulesForAllRequests="true">
<remove name="UrlRoutingModule-4.0" />
<add name="UrlRoutingModule-4.0" type="System.Web.Routing.UrlRoutingModule" />
</modules>
</system.webServer>
yes, I completed the certifcation process today with Spectrum Shades. What did you need to know?
Do the calculation the way a child divides 12 ÷ 5. Consider a longer pair of numbers, 123456 ÷ 54321, and start with the len function: it will let you see that 54321 fits into 123456 about two times. Then separate each digit of 54321 and multiply each by 2 (carrying the tens to the next position). Now index the result and subtract it from the corresponding digits of 123456. Write a program for 12 ÷ 5 first and you will be able to divide large numbers, up to googol-sized and beyond.
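As a rough illustration of that schoolbook procedure, here is a digit-by-digit long division in Python (the function name and loop structure are my own sketch, not the poster's program; note that Python's built-in integers are already arbitrary precision, so this is purely didactic):

```python
def long_divide(dividend: int, divisor: int) -> tuple[int, int]:
    """Schoolbook long division: bring down one digit at a time."""
    if divisor == 0:
        raise ZeroDivisionError("division by zero")
    remainder = 0
    quotient_digits = []
    for digit in str(dividend):
        # Bring down the next digit, then see how many times the
        # divisor fits into the running remainder.
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(str(remainder // divisor))
        remainder %= divisor
    return int("".join(quotient_digits)), remainder

print(long_divide(12, 5))          # (2, 2)
print(long_divide(123456, 54321))  # (2, 14814)
```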
I recently tried figuring out the math behind Gimp's Colour Erase blend mode (basically the same as the Colour to Alpha filter), and this is what I ended up with:
[C++ Code]
[Detailed explanations of how it works.]
The gist of it is: with a fully opaque bottom layer, Normal blend is just a linear interpolation. With both layers fully opaque, Colour Erase is the inverse operation of that Normal blend.
In your colour space of choice, trace a line that passes through both the Top and Bottom colours. Starting from Bottom, walk that line away from Top until you meet the border of the unit cube (i.e., the limit of your colour gamut). That point is your resulting colour. Its opacity is the distance from Top to Bottom divided by the distance from Top to the result.
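Here is how I read that geometric description, as a small Python sketch. This is my own interpretation of the prose above, not the original C++; colours are RGB triples in [0, 1]:

```python
def colour_erase(top, bottom):
    """Walk from `bottom` along the line away from `top` until the unit
    cube boundary; return (result_colour, alpha) such that blending
    result over top with that alpha reproduces bottom."""
    # Parametrise P(s) = bottom + s * (bottom - top); find the largest s
    # that keeps every channel inside [0, 1].
    s_max = float("inf")
    for t, b in zip(top, bottom):
        d = b - t
        if d > 0:
            s_max = min(s_max, (1.0 - b) / d)
        elif d < 0:
            s_max = min(s_max, -b / d)
    if s_max == float("inf"):  # top == bottom: colour fully erased
        return bottom, 0.0
    result = tuple(b + s_max * (b - t) for t, b in zip(top, bottom))
    # Opacity is |top - bottom| / |top - result| = 1 / (1 + s_max).
    alpha = 1.0 / (1.0 + s_max)
    return result, alpha

# Erasing white from mid-grey leaves black at 50% opacity.
print(colour_erase((1.0, 1.0, 1.0), (0.5, 0.5, 0.5)))
```

A quick check of the inverse property: blending the result over Top with the returned alpha (alpha * R + (1 - alpha) * T, per channel) lands back on Bottom.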
I wrote a package for your need; try reading the example section and implementing it:
Under Windows 11, the filedialog looks very old-fashioned. But I heard that in the next release a "modern" version will be available. The current filedialog doesn't even show the correct starting folders and is unusable in a production environment.
Maybe use Runtime parameters?
"Runtime parameters let you have more control over what values can be passed to a pipeline. With runtime parameters you can:
Supply different values to scripts and tasks at runtime"
Chatted with OneSignal, and they recently changed the API to use the REST API key instead of the User Auth Key.
You need to set the following classes
html,body {height:100%;}
and main {height:100%;}
Exiftool's QuickTime Tags page lists these (Note: This is not an authoritative source, but still useful):
'gshh' GoogleHostHeader string
'gspm' GooglePingMessage string
'gspu' GooglePingURL string
'gssd' GoogleSourceData string
'gsst' GoogleStartTime string
'gstd' GoogleTrackDuration string
Cheers.
The code above seems to work. The names are added to the input box when you select them and removed from the input box when deselected (see demo below)
Can you explain in more detail what the issue is?
$('input:checkbox').change((e) => {
if ($(e.currentTarget).is(':checked')) {
var curVal = $('#name').val();
if (curVal) {
$('#name').val(curVal + ', ' + e.currentTarget.value);
} else {
$('#name').val(e.currentTarget.value);
}
} else if (!($(e.currentTarget).is(':checked'))) {
var curVal = $('#name').val().split(',');
var filteredVal = curVal.filter(el => el.trim() !== e.currentTarget.value)
$('#name').val(filteredVal.join(','));
}
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>
<input type="text" id="name"></input>
<br/>
<input type="checkbox" value="Alabama">Alabama<br/>
<input type="checkbox" value="Alaska">Alaska<br/>
<input type="checkbox" value="Arkansa">Arkansa<br/>
A wrapper around james-heinrich/getid3 to extract various information from media files.
If you want the app to display the permission pop up again, the only way to do this is to uninstall the Expo Go app and reinstall it.
Here's a link to the Expo docs on this topic:
Disabling the permissions via the settings does undo the permission, but it will not lead to the app displaying the pop up again.
Apparently, this is not a bug or limitation, but an OS-level restriction imposed by both Android and iOS.
The custom use-sound hook doesn't suffer from this problem in Safari. Made my day.
What did you end up doing?????
From libGDX https://github.com/libgdx/Jamepad works on all desktop platforms
Here is a pair of functions for splitting a list into two parts at a given index:
lst = [1, 2, 3, 4, 5, 6]
def split(lst, x):
    return lst[:x]
def rest_lst(lst, x):
    return lst[x:]
print(split(lst, 2))
print(rest_lst(lst, 2))
The output:
[1, 2]
[3, 4, 5, 6]
You can try optimising the html2canvas settings by lowering the image scale (e.g., window.devicePixelRatio / 2). A second option would be to try an alternative like dom-to-image.
There's a Python tool that does this for you. Maybe run it against your pyproject.toml and see what changes it makes?
Me too. Xcode 16.1: all simulators fail to sync iCloud data on iOS 18.1; iOS 18 is OK. A bug report is about to go in. I'm getting a Core Data error, "Failed user key sync". This has been an issue in previous Xcode versions too, but had been fixed.
For me, none of the solutions mentioned here ever worked. I had to install php-fpm in the nginx container and use its Unix (file) socket instead of the TCP socket to the fpm container.
As an alternative solution, I like the answer I got from here. In it, the secondary find box (Ctrl+Shift+F) on the sidebar can be moved to the bottom panel.
Bash command to detect PostgreSQL synchronous standby by Patroni API and jq
/usr/pgsql-16/bin/pg_isready -q && \
test "$(hostname -I | xargs):5432" == "$(curl -s localhost:8008/cluster | jq -Mcr '.members[] | select (.role == "sync_standby") | .host + ":" + (.port|tostring)')"
To update or refresh a toSignal value, you could use a BehaviorSubject, as in this code:
product.service.ts
================================
public product_url = "http://127.0.0.1:8000/api/notes"
getProducts_(): Observable<any> {
return this.apiService.get(this.product_url);
}
product-list.component.ts
================================
private refresh$ = new BehaviorSubject<void>(undefined);
dataList$=this.refresh$.pipe(switchMap(()=>this.productService.getProducts_()))
dataSignal = toSignal(this.dataList$)
refreshAdd() {
this.refresh$.next();
}
<table class="table table-hover">
<thead>
<tr>
<th scope="col">#</th>
<th scope="col">Product name</th>
</tr>
</thead>
<tbody>
<tr *ngFor="let product_data of dataSignal(); let i = index">
<th scope="row">{{i+1}}</th>
<td>{{product_data.title}}</td>
</tr>
</tbody>
</table>
Even though the answer from Simon Dold works fine, a better solution, compatible with TypeScript, would be to map the Sequelize Model instance to a JSON-serializable object, like:
const user = await User.findOne({where: {id}}).then(user => ({id: user.id, name: user.name}))
No. You can also use a .def file to specify exported functions, even if that's not as common.
Use BoxScope.
When BoxScope is used, it defines the scope of the associated composable:
content: @Composable BoxScope.() -> Unit,
@Jonathan Borrelli I have the same problem with SDL3-3.1.6. Can you tell me how you installed the SDL3_Image library?
Add these lines in the initial setup to create the repository.
git init my_repo
cd my_repo
Then, run the tooling script.
I have a problem connecting to Bitbucket from SourceTree using SSH as well. Be aware that for Bitbucket the answer from Shrava40 is not correct according to the documentation (https://support.atlassian.com/bitbucket-cloud/docs/configure-ssh-and-two-step-verification/), where they say only a key length of 2048 is supported for the RSA key type.
(Simplified) In the instance of OpenGL, the library serves two purposes.
In the process of accomplishing these goals, the graphics libraries get useful features built in that make it easy to experiment with your hardware.
When you make an OpenGL program, you're mainly focused on programming shaders in the GLSL language and seamlessly compiling and running said shaders via the C language to invoke the graphical results you were looking for.
There are entire books on the theory behind graphical programming. Libraries such as OpenGL provide programming interfaces for implementing these theories (e.g., 2D and 3D visualization concepts).
You can access old versions of android studio from this link
I use Eclipse and build UI front-ends with WindowBuilder in that environment. I got interested in IntelliJ (why? just curious) and downloaded it. There was no clue onboard about building a UI in it. After hours of only finding YouTube videos on coding your own UI from scratch while using IntelliJ, I came to realize it is totally useless for my purposes; the IntelliJ IDE is incapable of using a UI builder like WindowBuilder.
Result: I removed the IntelliJ application from my several computers and apologized to Oracle and Eclipse.
I found a solution. Both Get Playlist Items and Get Playlist respond with a next variable, which is the link to the next 100 tracks of the playlist (or to however many tracks are left, if fewer than 100), or which gets the value None when there are no more left. In the first case it is visible immediately in the response, while in the latter it is nested under the tracks object (tracks.next).
I got all tracks by iteratively fetching the responded tracks until next was None, each time using the URL returned in the response.
Cheers!
As stated on the Kiwi.com media blog, from 30 May 2024 the API will only work for B2B partners. It is no longer even possible to register new accounts, unless you are chosen as an affiliate partner. You can read the post here. Log in with your credentials and check if you have access to the B2B portal.
@Rajesh Kumar Sahoo I think you should use something like this (using the code example above):
Public Sub SayHello(ByVal name As String)
Call YourMacroName
End Sub
This error will happen if you upgrade firebase-functions without upgrading your tooling.
In order to fix it, just run npm i -g firebase-tools
From the plugin page: https://plugins.jetbrains.com/plugin/13882-godot-support
Starting with Rider 2024.2, this plugin is now bundled with the software.
As a result, no further updates will be published on the Marketplace.
Adds support for the Godot engine.
For the placeholder, you must provide your actual AWS account ID; the angle brackets (<>) mean you have to replace them with the real value. Also, diff only shows you the difference between the previous stage and the new stage; it does not deploy.
You can resolve this memory issue by using the XMLReader PHP class instead of DOMDocument.
If you're using the Magento framework, you can use the Laminas\Config\Reader\Xml class instead of the Magento\Framework\Xml\Parser class.
The DOMDocument class will try to load all of the XML content into memory at once, but the XMLReader class will not; it reads the nodes incrementally, set by set.
You can find more info on this one in this article https://medium.com/devops-dev/way-to-read-large-xml-dataset-in-magento-2-using-laminas-config-reader-xml-2b59c936bbcc :)
Cheers!
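Python's standard library has the same incremental idea: xml.etree.ElementTree.iterparse streams the document instead of loading it whole. A minimal sketch (the <item> element names are invented for illustration):

```python
import xml.etree.ElementTree as ET
from io import BytesIO

# A tiny in-memory document standing in for a huge XML file.
xml_bytes = BytesIO(b"<items><item>a</item><item>b</item></items>")

names = []
# iterparse yields each element as soon as it is complete, so the
# whole tree never needs to be in memory at once.
for event, elem in ET.iterparse(xml_bytes, events=("end",)):
    if elem.tag == "item":
        names.append(elem.text)
        elem.clear()  # drop the element's content to keep memory flat
print(names)  # ['a', 'b']
```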
Fun problem!
I made some slight changes. For example, I added data %>% to code1 and code2 because I can't remember how to concatenate that programmatically. I prefer rlang to base R's metaprogramming. You can read the metaprogramming section of Advanced R for a deep dive, or this PDF cheat sheet for a quick intro.
I used rlang::enexpr, replacing eval(substitute(...)). You also mistyped some of the column names, e.g. states, which should have been state.
Lastly, I used rlang::eval_tidy to run the code.
I haven't checked that the output is correct, however. I'll leave that to you, and please let me know!
apply_conditional_summary <- function(data, code1, code2) {
if ("category" %in% colnames(data)) {
result <- enexpr(code2)
} else {
result <- enexpr(code1)
}
return(rlang::eval_tidy(result))
}
apply_conditional_summary(
df2,
data %>% group_by(date, state) %>% summarise(units = mean(units), total_amt = sum(amount), .groups = "drop"),
data %>% group_by(date, state, category) %>%
summarise(units = mean(units), total_amt = sum(amount), .groups = "drop") %>%
group_by(date, state) %>%
mutate(share = total_amt / sum(total_amt)) %>%
ungroup()
)
For other lost peeps like me: the thing missing in the OP's workflow is the checkout step. I was getting the same "empty archive" until I added the always-necessary uses: actions/checkout@v4
Could it just be an error in uploading artifacts to Maven Central?
There is only a sudden "Drools 9.44.x" archive, with nothing preceding it in the 9.x major release, and its date is exactly that of "Drools 8.44.0 Final": Sep 06, 2023.
According to the documentation:
report_to (str or List[str], optional, defaults to "all")
Try to pass report_to="none"
heap = []
for num in nums:
    heapq.heappush(heap, num)
can be reduced to the code below, which has O(n) complexity:
heapq.heapify(nums)
while this given code -
for _ in range(len(nums) - k):
    heapq.heappop(heap)
return heapq.heappop(heap)
can be reduced to the code below, which runs in O(n log k):
return heapq.nlargest(k, nums)[-1]
For more info on what's the complexity of each function, read - https://dpythoncodenemesis.medium.com/understanding-pythons-heapq-module-a-guide-to-heap-queues-cfded4e7dfca
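A self-contained sketch of the bounded-heap approach for the k-th largest element (the function name is my own):

```python
import heapq

def kth_largest(nums, k):
    """k-th largest element via heapq.nlargest.

    nlargest maintains a heap of only k elements while scanning nums,
    giving O(n log k) instead of the O(n log n) of a full sort.
    """
    return heapq.nlargest(k, nums)[-1]

print(kth_largest([3, 2, 1, 5, 6, 4], 2))  # 5
```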
https://dev.mysql.com/doc/refman/en/innodb-transaction-isolation-levels.html has the answer:
Consistent reads within the same transaction read the snapshot established by the first read.
Hello, I had the same problem and the solutions discussed in this post did not work for me. I was using the MAMP service, but then I noticed that at the bottom of the local start page are the username and password; in my case both were root.
If any of you are still around, can you please help me? I have a similar problem, but instead of 3i+1 I have 1/1+r.
In the following line: !!sqlcmd /E /S$(SECONDARY) -i DRRestoreDatabase.sql -v BKDIR="$(SBKSHARE)" -v DATADIR="$(SDATADIR)" -v LOGDIR="$(SLOGDIR)"
Where should I get the script "DRRestoreDatabase.sql"?
Not sure if this is what you're looking for or not...
Whenever I have created code to create and send emails, the methods I have used generally have an .HTMLBody property to add the HTML to, rather than adding it to .Body.
I solved it now just by using codesign --force --deep --sign - binaryname
Thanks for sharing your knowledge! It helped me a lot! From: Brazil :)