It's an old post, but if you could do that, then Google, Microsoft, or any large server in the world could be crashed by a single client. And that's not how the Internet works! When you request a resource from a server, you, as the client, receive it chunk by chunk. And if you only request but never receive the bytes, then as soon as the server notices it is sending data to nowhere, it stops. Think of it as an electric wire: it lets current flow, right? If you cut the wire or connect the end to nothing, the current has nowhere to go.
One thing you can do is write some software and distribute it to people all over the world. The software targets a specific website or server; that's called a DDoS, and you've just made malware! The people installing your malware turn their PCs into zombie machines sending requests to your target server. Fulfilling a huge number of requests from all over the world overloads the server, and then it shuts down.
After all, what you're asking for is disgusting. It shows no respect to the development world, which needs to improve, not harm anybody. And for that reason I'm going to flag this post. Sorry.
Refer to this document; this worked for me:
https://blog.devgenius.io/apache-superset-integration-with-keycloak-3571123e0acf
Just for future reference, another way to achieve the same (which is mpv specific) is
play-music()
{
    # Pick an .mp3 file in the current directory with fzf; mapfile keeps names with spaces intact
    local selected
    mapfile -t selected < <(find "${PWD}" -mindepth 1 -maxdepth 1 -type f -iname "*.mp3" | LC_COLLATE=C sort | fzf)
    [[ ${#selected[@]} -gt 0 ]] || return
    mpv --playlist=<(printf "%s\n" "${selected[@]}") </dev/null &>/dev/null & disown
}
I had the same issue in a previous version and found that some of my add-ons were bugging out some of the keyboard shortcuts. I set the add-ons to run only when needed and reset my keyboard shortcuts from the Tools menu.
In iOS 18 and later, an official API has been added to retrieve the tag value, making it easier to implement custom pickers and similar components.
https://developer.apple.com/documentation/swiftui/containervalues/tag(for:)
func tag<V>(for type: V.Type) -> V? where V : Hashable
func hasTag<V>(_ tag: V) -> Bool where V : Hashable
Since it is a batch job, consider using a GitHub Actions job that runs on a schedule. Use the bq utility.
Just add this to your pubspec.yaml:
dependency_overrides:
  video_player_android:
    git:
      url: https://github.com/dennisstromberg/video_player_android.git
This solved the problem for me
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
    <version>${beam.version}</version>
    <scope>runtime</scope>
</dependency>
Depending on the number of occurrences you have for each key, you can modify this JCL to meet your requirements:
Mainframe Sort JCL - transpose rows to columns
Thank you all. I had forgotten to replace myproject.pdb.json on the server.
There was the same character string in both 'bunchofstuff' and 'onlypartofstuff', so I used a rewrite rule where, if, say, 'stuff' was in the URL, nothing was done; otherwise, forward to the 'https://abc.123.whatever/bunchofstuff' URL. One caveat: if someone typed in the full URL for 'bunchofstuff' or 'onlypartofstuff', the rule doesn't check for 'https'. But I don't think anyone would type one of those longer URLs in by hand (they'd use a link with 'https' in it), and the main abc.123.whatever will forward.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
# Simulated price data (or replace with real futures data)
np.random.seed(42)
dates = pd.date_range(start='2010-01-01', periods=3000)
price = 100 + np.cumsum(np.random.normal(0, 1, size=len(dates)))
df = pd.DataFrame(data={'Close': price}, index=dates)
# Pure technical indicators: moving averages
df['SMA_50'] = df['Close'].rolling(window=50).mean()
df['SMA_200'] = df['Close'].rolling(window=200).mean()
# Strategy: Go long when 50 SMA crosses above 200 SMA (golden cross), exit when below
df['Position'] = 0
df.loc[df['SMA_50'] > df['SMA_200'], 'Position'] = 1  # .loc avoids chained-assignment issues
df['Position'] = df['Position'].shift(1)  # Shift by one day to avoid lookahead bias
# Calculate returns
df['Return'] = df['Close'].pct_change()
df['Strategy_Return'] = df['Return'] * df['Position']
# Performance
cumulative_strategy = (1 + df['Strategy_Return']).cumprod()
cumulative_buy_hold = (1 + df['Return']).cumprod()
# Plotting
plt.figure(figsize=(12, 6))
plt.plot(cumulative_strategy, label='Strategy (Technical Only)')
plt.plot(cumulative_buy_hold, label='Buy & Hold', linestyle='--')
plt.title('Technical Strategy vs Buy & Hold')
plt.legend()
plt.grid(True)
plt.tight_layout()
plt.show()
I can't speak for ACF, but both BoxLang and Lucee use the same code paths under the hood for executing your query whether you use a tag/component or a BIF.
Here are the relevant files in BoxLang (both use PendingQuery):
pip install webdrivermanager
webdrivermanager chrome --linkpath /usr/local/bin
(check if chrome driver is in PATH)
I was about to comment that I had the same issue since a few weeks (Chrome on Windows 11), but when I opened my Chrome settings to specify my version (which was 137), chrome auto-updated. Once relaunched with version 138, Google voices started working again.
The solution is to change the AspNetCoreHostingModel of all the applications to OutOfProcess in the web.config:
<AspNetCoreHostingModel>OutOfProcess</AspNetCoreHostingModel>
Bigtable's SQL support is now in GA and supports server-side aggregations. In addition to read-time aggregations it is also possible to define incremental materialized views that do rollups/aggregations at ingest-time.
In a lot of real-time analytics applications, one can imagine stacking these e.g. use an incremental materialized view to aggregate clicks into 15 minute windows and then at read time apply filters and GROUP BY to convert pre-aggregated data into a coarser granularity like N hours, days etc.
Here’s what I’d try:
Instead of blocking on city OR postal code, block on city AND the first few digits of postal code — this cuts down your candidate pairs a ton.
Convert city and postal code columns to categorical types to speed up equality checks and save memory.
Instead of doing fuzzy matching row-by-row with apply(), try using RapidFuzz's batch functions to vectorize street name similarity (a sketch follows this list).
Keep your early stopping logic, but order components by weight so you can bail out faster.
Increase the number of Dask partitions (like 500+) and if possible run on a distributed cluster for better parallelism.
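A minimal sketch of that batch approach (see the third point above), assuming RapidFuzz and NumPy are installed; the toy street names and the 80 threshold are placeholders, not values from your data:
import numpy as np
from rapidfuzz import fuzz, process
# Toy data standing in for the street-name columns of the two frames
left_streets = ["Main Street", "Oak Avenue", "Elm Drive"]
right_streets = ["Main St.", "Oak Ave", "Pine Road"]
# cdist computes the whole similarity matrix in C, parallelised across workers,
# instead of calling a scorer once per row from Python
scores = process.cdist(left_streets, right_streets, scorer=fuzz.token_sort_ratio, workers=-1)
# Keep only the candidate pairs above a threshold for the rest of the comparison
rows, cols = np.where(scores >= 80)
candidate_pairs = list(zip(rows.tolist(), cols.tolist()))
print(candidate_pairs)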
I have this problem too. There may be alternative ways but why is this not working?
It's better to have a server, but if you don't have any server you can use HOST.
I was doing a refresher on web application security recently, asked myself the same question, and got pretty annoyed while trying to understand the answer, because even the literature itself seems to mix the mathematical theory with real-life implementations.
The top-voted answer explains it at length but for some reason fails to state it plainly, so I'd like to make a small addition for any application developer who stumbles upon this. It is obvious enough that we should not use private and public keys interchangeably, as their names suggest. The question here is: the literature definition of private and public key pairs states that:
Only the private key can decrypt a message encrypted with the public key
Only the public key can decrypt a message encrypted with the private key
Why then can't one be used in place of the other? That is a completely legitimate question if you take real-world applications out of the picture, which the literature also often tends to do.
The simple answer pertains to the actual implementations of the exchange algorithm in question, i.e. RSA, and is that the public key can be extracted from the private key contents.
The reason is that when generating a private key file with RSA using pretty much any known tool in actual practice, the resulting file contains both exponents, and therefore both keys. In fact, when using openssl or ssh-keygen, the public key can be re-extracted from the original private key contents at any time: https://stackoverflow.com/a/5246045
Conceptually, neither of the exponents is mathematically "private" or "public"; those are just labels assigned upon creation and could easily be assigned in reverse, and deriving one exponent from the other is an equivalent problem from either perspective. In that sense:
Tl;dr: private keys and public keys are not interchangeable, and you must be a good boy and host your private key only on the server/entity that needs to be authenticated by someone else (and it's equally important to tell you that you should wear a seatbelt while driving your car). The reason is that the private key contents generally hold information for both keys, so the public key can be extracted from them. That holds for pretty much any tool that implements RSA.
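As a quick illustration of that last point, here is a small sketch using Python's cryptography package (not part of the original answer); the public key object is derived directly from the freshly generated private key, with no extra input:
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa
# The private key object already contains both exponents
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
# The public key is simply re-derived from the private key material
public_key = private_key.public_key()
print(public_key.public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
).decode())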
Include windows.h and add this code to the start of your program (within main()) to get the virtual terminal codes to work in CMD.EXE:
HANDLE hOut = GetStdHandle(STD_OUTPUT_HANDLE);
DWORD dwMode = 0;
GetConsoleMode(hOut, &dwMode);
dwMode |= ENABLE_VIRTUAL_TERMINAL_PROCESSING;
SetConsoleMode(hOut, dwMode);
] should be placed directly after ^! Hence, [^][,] is correct, while [^[,]] is incorrect.
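A quick check in Python (the sample string is only for illustration):
import re
# [^][,] is a character class excluding ']', '[' and ','
print(re.findall(r"[^][,]+", "a,[b],c"))  # ['a', 'b', 'c']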
Do you mean button like that?
you can use
composeView.addButton(buttonDescriptor)
InboxSDK.load(1, 'YOUR_APP_ID_HERE').then(function(sdk){
  sdk.Compose.registerComposeViewHandler(function(composeView){
    composeView.addButton({
      title: "title",
      onClick: () => { console.log("clicked"); }
    });
  });
});
In VS Code, when you hover over the args, a small popup will open. It will give you options to edit, clear, or clear all.
realloc() is not intended to re-allocate memory of stack variables (local variables of a function). This might seem trivial, but it is a frequent source of memory bugs. Hence,
uint64_t* memory = NULL;
should be a heap allocation:
uint64_t *memory = (uint64_t *) malloc(sizeof(uint64_t));
import json
import snowflake.connector
def load_config(file_path):
"""Load the Snowflake account configuration from a JSON file."""
with open(file_path, 'r') as file:
return json.load(file)
def connect_to_snowflake(account, username, password, role):
"""Establish a connection to a Snowflake account."""
try:
conn = snowflake.connector.connect(
user=username,
password=password,
account=account,
role=role
)
return conn
except Exception as e:
print(f"Error connecting to Snowflake: {e}")
return None
def fetch_tags(conn):
"""Fetch tag_list and tag_defn from the Snowflake account."""
cursor = conn.cursor()
try:
cursor.execute("""
SELECT tag_list, tag_defn
FROM platform_common.tags;
""")
return cursor.fetchall() # Return all rows from the query
finally:
cursor.close()
def generate_sql_statements(source_tags, target_tags):
"""Generate SQL statements based on the differences in tag_list values."""
sql_statements = []
# Create a set for target tag_list values for easy lookup
target_tags_set = {tag[0]: tag[1] for tag in target_tags} # {tag_list: tag_defn}
# Check for new tags in source that are not in target
for tag in source_tags:
tag_list = tag[0]
tag_defn = tag[1]
if tag_list not in target_tags_set:
# Create statement for the new tag
create_statement = f"INSERT INTO platform_common.tags (tag_list, tag_defn) VALUES ('{tag_list}', '{tag_defn}')"
sql_statements.append(create_statement)
return sql_statements
def write_output_file(statements, output_file):
"""Write the generated SQL statements to an output file."""
with open(output_file, 'w') as file:
for statement in statements:
file.write(statement + '\n')
def main():
# Load configuration from JSON file
config = load_config('snowflake_config.json')
# Connect to source Snowflake account
source_conn = connect_to_snowflake(
config['source']['account'],
config['source']['username'],
config['source']['password'],
config['source']['role']
)
# Connect to target Snowflake account
target_conn = connect_to_snowflake(
config['target']['account'],
config['target']['username'],
config['target']['password'],
config['target']['role']
)
if source_conn and target_conn:
# Fetch tags from both accounts
source_tags = fetch_tags(source_conn)
target_tags = fetch_tags(target_conn)
# Generate SQL statements based on the comparison
sql_statements = generate_sql_statements(source_tags, target_tags)
# Write the output to a file
write_output_file(sql_statements, 'execution_plan.sql')
print("Execution plan has been generated in 'execution_plan.sql'.")
# Close connections
if source_conn:
source_conn.close()
if target_conn:
target_conn.close()
if __name__ == "__main__":
main()
You can adjust the "window.zoomLevel" in your settings.json file to increase or decrease the font size of the sidebar. To change the font size in the text editor, use "editor.fontSize", and for the terminal, use "terminal.integrated.fontSize".
Once you find the right balance, it should significantly improve your comfort while working.
That is, unless you’re aiming to style a specific tab in the sidebar individually.
What helped for me:
I had Pyright installed, so I opened settings by pressing Command+, and typing @ext:anysphere.cursorpyright, found Cursorpyright › Analysis: Type Checking Mode, and changed it from "basic" to "off".
I spent 4 hours trying to configure php.ini with no result.
In my case the issue was Avast.
Disable it and it works fine.
You need to create a custom Protocol Mapper in Keycloak to programmatically set the userId value before the token is generated. This guide may help you get started and give you a clear idea of the implementation process:
Looks like you are sending some HTTP statuses (via completeWithError) when some data has already been written into the SSE stream (HTTP body).
Did you manage to solve this?
Better late than never, you might want to try this one: PHP routes
Git 2.49.0-rc0 finally added the --revision option to git clone: https://github.com/git/git/commit/337855629f59a3f435dabef900e22202ce8e00e1
git clone --revision=<commit-ish> $OPTIONS $URL
I’ve faced a similar issue while working on a WooCommerce-based WordPress site for one of our clients recently. The WYSIWYG editor (TinyMCE) stopped loading properly, especially in the Product Description field, and we got console errors just like yours.
Here are a few things you can try:
1. Disable All Plugins Temporarily
The error Cannot read properties of undefined (reading 'wcBlocksRegistry') is often related to a conflict with the WooCommerce Blocks or another plugin that’s hooking into the editor.
Go to your plugins list and temporarily deactivate all plugins except WooCommerce.
Then check if the editor loads correctly.
If it does, reactivate each plugin one by one to identify the culprit.
2. Switch to a Default Theme
Sometimes the theme might enqueue scripts that interfere with the block editor. Try switching to a default WordPress theme like Twenty Twenty-Four to rule that out.
3. Clear Browser & Site Cache
This issue can also be caused by cached JavaScript files:
Clear your browser cache
If you're using a caching plugin or CDN (like Cloudflare), purge the cache
4. Reinstall the Classic Editor or Disable Gutenberg (Temporarily)
If you're using a classic setup and don't need Gutenberg, install the Classic Editor plugin and see if that resolves the issue. It can bypass block editor conflicts temporarily.
5. Check for Console Errors on Plugin Pages
Go to WooCommerce > Status > Logs to see if anything unusual is logged when the editor fails to load.
6. Update Everything
Ensure:
WordPress core
WooCommerce
All plugins & themes
...are fully updated. These kinds of undefined JavaScript errors are often fixed in plugin updates.
Let me know what worked — happy to help further if you're still stuck. We had a very similar case at our agency (Digital4Design), and in our case, it was a conflict between an outdated Gutenberg add-on and a WooCommerce update.
For those using macOS or Linux:
JAVA_HOME=$(readlink -f "$(which java)" | sed 's#/bin/java##')
CREATE OR REPLACE PROCEDURE platform_common.tags.store_tags()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
-- Create or replace the table to store the tags
CREATE OR REPLACE TABLE platform_common.tags (
database_name STRING,
schema_name STRING,
tag_name STRING,
comment STRING,
allowed_values STRING,
propagate STRING
);
-- Execute the SHOW TAGS command and store the result
EXECUTE IMMEDIATE 'SHOW TAGS IN ACCOUNT';
-- Insert the results into the tags table
INSERT INTO platform_common.tags (database_name, schema_name, tag_name, comment, allowed_values, propagate)
SELECT
"database_name",
"schema_name",
"name" AS "tag_name",
"comment",
"allowed_values",
"propagate"
FROM
TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE
"database_name" != 'SNOWFLAKE'
ORDER BY
"created_on";
RETURN 'Tags stored successfully in platform_common.tags';
END;
$$;
Replace
- export HOST_PROJECT_PATH=/home/project/myproject
with
- export HOST_PROJECT_PATH=${BITBUCKET_CLONE_DIR}
I found the problem and would like to share it.
It is possible to save a task without assigning it to a user
In the update function I do this:
$user = User::findOrFail($validated['user_id']);
$customer = Customer::findOrFail($validated['customer_id']);
I finally figured it out. The main issue was that I initially linked my new External ID tenant to an existing subscription that was still associated with my home directory, which caused problems.
To resolve it, I created a new subscription and made sure to assign it directly to the new tenant / directory.
After that, I was able to switch directories again — and this time, MFA worked as expected, and I successfully switched tenants.
Additionally, I now see that I’m assigned the Global Administrator role by default in the new tenant, just as expected and as confirmed in the Microsoft Docs
By default, the user who creates a Microsoft Entra tenant is automatically assigned the Global Administrator role.
In my opinion, a more effective approach is to interpret the fixed point as a separator between the high and low bits. In that case, scaling becomes an arithmetic shift. For example, the decimal number 5.25 is 101.01 in binary representation.
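A small Python illustration of that idea (the choice of 2 fractional bits is just an example):
FRAC_BITS = 2
raw = int(5.25 * (1 << FRAC_BITS))  # 21 == 0b10101, i.e. 101.01 with the point implied
# Scaling by a power of two is just an arithmetic shift of the raw integer
doubled = raw << 1                  # 42 -> represents 10.5
halved = raw >> 1                   # 10 -> represents 2.5 (rounded toward -infinity)
print(bin(raw), doubled / (1 << FRAC_BITS), halved / (1 << FRAC_BITS))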
C++ code for transposing a square matrix in place, without using a new matrix:
for (int i = 0; i < arr.size(); i++) {
    for (int j = 0; j < i; j++) {
        swap(arr[i][j], arr[j][i]);
    }
}
Is it resolved? I am facing the same issue.
Did you add the description field later? Your code looks good actually.
python manage.py search_index --delete -f
python manage.py search_index --create
python manage.py search_index --populate
I found the problem. When working with the Timer4 interrupt, at one point we need to enable tim4int using the third bit of the EIE2 register. I wrote EIE2 &= 0x08; instead of EIE2 |= 0x08;, and that clears the other bits, including the first bit of EIE2, which enables tim3int. Thank you...
Replacing void with Task helps. But it took a long time to find that out after trying everything...
The label looks off because you're using a small font.
InputLabelProps: {
sx: {
fontSize: '13px',
top: '50%',
transform: 'translateY(-50%)',
'&.MuiInputLabel-shrink': {
top: 0,
transform: 'translateY(0)',
},
},
},
I think I've found the problem. In view.py, I create a gRPC channel for each request. A channel takes time to connect to the server, and I think that if I send a gRPC request while the channel is not yet connected, this error happens. The code is under development; I will change view.py to reuse the gRPC channel. After that, if the error persists, I will use your suggestion and let you know the result. Thanks.
The problem is that users like "Everyone" or "Users" do not exist in a German Windows installation;
they are called "Jeder" and "Benutzer".
So there must be a generic way, which I thought is:
User="[WIX_ACCOUNT_USERS]"
But I cannot get it to work in WiX 6.01.
R is like you are setting the ratings for a show yes it is then you find in the text why you think you can buy what you want then you create an API for the demo then you get something usely in a scrip be a man while driving ford then you know the man is normal but the truck is not so if you complete that and make it rated r you win the truck
In atomics it would be devastating for the guy to be normal and a big truck to talk to ... So you don't do the ratings and you get hershoma i.e to degliate not go off and be in bit pieces .
You get command runs on your phone talking to you ? Well that's my robot talking to you defending it's self I'll have to talk with it before making any more assumptions... Haha it told you what we were doing is rated R.
just update VS to the latest build or use the workaround mentioned in the issue
Try to update node-abi:
npm install node-abi@latest
The newer version of node-abi includes support for Electron 37.1.0.
I found that in the Solution's NuGet packages.
I managed to solve the issue by knowing exactly the number of differences in the first few lines and counting onwards from there.
diff a.txt b.txt &> log.log
test "$(wc -l < log.log)" != <known number of differences>
Delete the Python path from the environment variables, then install Python again. It will work.
On Ubuntu this is enough:
sudo apt-get install libxml2-dev libxslt-dev
Also note that python-dev in the documentation refers to Python 2 and is not needed for Python 3 installations.
I suggest using LangGraph Studio: https://langchain-ai.github.io/langgraph/concepts/langgraph_studio/.
It can show you the graph as you work on it, and you can run live tests in graph mode and in chat mode; it's also integrated with LangSmith. Pretty useful for agent development.
Make your DTO a value class and it won't be part of the serialized object.
It seems this was user error (kind of). In Figma I have a layer that is used as a mask (to clip the SVG). Every other program doesn't have an issue with it having a color, but Godot 4.4 seems to apply it anyway.
As you can see, the mask has a color and a blend mode, and that gets applied in Godot, making it darker. If I set the "Fill" to white, everything looks OK.
According to the information here with Vuforia 11.3 you need Unity 6 (6.0.23f1+)
Use a CTE to evaluate the case, then the main query can filter -
with x as (select ... case this, case that ... from tbls)
select <whatever> from x where <your-filters>;
For this purpose, using a custom dataset is the recommended approach. You can follow this guide:
https://docs.kedro.org/en/0.19.14/data/how_to_create_a_custom_dataset.html#advanced-tutorial-to-create-a-custom-dataset
The implementation should be straightforward - you mainly need to define the _save method like this (note that AbstractDataset also expects _load and _describe to be implemented):
class YourModelDataset(AbstractDataset):
    def __init__(self, filepath: str):
        self._filepath = filepath

    # AbstractDataset also requires _load and _describe implementations
    def _save(self, model: PipelineModel) -> None:
        model.save(self._filepath)
Once defined, just register the dataset in your catalog.yml with the appropriate type and filepath.
Deleting my simulator device in Android Studio and then installing a new one worked for me.
Upgrade to java-client 9.5.0 (below) - it should work:
<dependency>
    <groupId>io.appium</groupId>
    <artifactId>java-client</artifactId>
    <version>9.5.0</version>
</dependency>
Google TV apps are built on the Android TV SDK, which is Java/Kotlin-based, whereas Samsung uses Tizen (HTML5/JS) and LG runs on webOS (also HTML5/JS). So you have to build separate codebases for each platform. Unfortunately, the Google TV guidelines and UI components won't directly translate to Tizen or webOS due to different runtime environments, design standards, and APIs.
However, the good news is:
Your backend logic (APIs, video streams, etc.) can remain the same across platforms.
You can follow a modular approach separating frontend UI logic from core business logic to minimize duplication.
There are some cross-platform frameworks (like React Native for TV or Flutter with custom rendering) but support is limited and usually not production-ready for Samsung/LG.
If you're looking for scale and faster deployment, many businesses go with white-label solutions like VPlayed, Zapp, or Accedo, which offer multi-platform Smart TV apps with a unified backend and consistent UX. For more details on accessing a white-label cloud TV platform check out: https://www.vplayed.com/cloud-tv-platform.php
In short yes, separate codebases are required but the strategy you use can save you a lot of time in the long run.
There is an option in Postman now that allows reverting changes in any collection, and thus in a forked collection as well.
Just click on the collection and, in the right side panel, click History. Choose where you want to restore changes from.
Done!
It is a bit slow so give it some time to reflect changes.
Ultimately this looks like an opinion-based question, so I will give you my two cents:
Option 1:
The server application will have to create a framework to remember the values written for each "command", if that is necessary. "Command" implies that writing on a particular command ends up performing a certain procedure and nothing too complicated; in that case Option 1 is a good option. If, however, you need to remember the values written for each command (which is implied since you want to support read), then you will have to write code that is already written by whichever BLE stack you are using.
Option 2:
This looks good if you already know the number of commands you will support, and hence the number of characteristics you will have; there is an overhead of discovery involved, though. The client will have to discover all characteristics at the beginning and remember which handle belongs to which UUID, and hence which command. While this involves some extra complication, it provides more flexibility to specify permissions for each command. You could have a command require encryption/authentication/authorization to write to, while keeping other commands with different permissions. You could also have different write properties, commands that only accept read or write, notifications/indications independently for each characteristic, size control for commands that will always have a particular size, better error code responses, etc.
If the requirements are only and exactly as you specified, then both options are fine; you could probably toss a coin and let fate decide, though note that my recommendation is the second option.
If there is a possibility of extending functionality in the future, I would only recommend the second option.
Did you manage to solve this? I had to do the following, which forced the connection to be Microsoft.Data.SqlClient by creating it myself. I also pass true to the base constructor to let the context dispose the connection afterwards.
using System;
using System.Data.Common;
using System.Data.Entity;
using Microsoft.AspNet.Identity.EntityFramework;
using Microsoft.Data.SqlClient;
using Mintsoft.LaunchDarkly;
using Mintsoft.UserManagement.Membership;
namespace Mintsoft.UserManagement
{
public class UserDbContext : IdentityDbContext<ApplicationUser>
{
private static DbConnection GetBaseDbConnection()
{
var connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["BaseContext"].ConnectionString;
return new SqlConnection(connectionString);
}
public UserDbContext()
: base(GetBaseDbConnection(), true)
{
}
    }
}
Yes, there is a way. Described here.
But...what utilizes ForkJoinPool?
ForkJoinPool is designed to split a task into smaller tasks and reduce the results of the computations at the end (e.g. a collection's parallel stream).
So if it's not suitable for your case, it would be advisable to use another ExecutorService.
The viewpoint I can add to solving the problem above is that Flask-Alembic migration errors often mask the real problem: code that accesses the database during module import, before the database is properly initialized. This works locally because the tables exist, but fails on fresh deployments.
It's strange that this error happens when you do a full reload 🤔.
I believe the problem you are having is due to partial page navigation. This is when SharePoint refreshes just part of the page, not the whole website, which means many parts (like onInit, or a useEffect on first run) will not rerun, and because of that sp.web may still be relying on an outdated or uninitialized context.
This may happen when you initialize sp.setup({...}) only once, say in onInit, and it's not updated on subsequent navigations.
You may find more context in this article.
https://www.blimped.nl/spfx-and-the-potential-pitfall-of-partial-page-navigation/
Hope what I suspect is correct and my comment will help you get unblocked 🤞🤞
Happy Coding!
Personally, I developed and use Evernox to manage my databases visually.
In Evernox you can create revisions for each version of your database/diagram.
From these revisions you can automatically generate migrations for different DBMSs like Postgres, MySQL, and BigQuery.
It's really convenient to manage your database in a visual diagram editor and then just create revisions for each version, where you can visually inspect the differences.
Here is an example; it will work on any platform:
https://www.bleuio.com/blog/ble-usb-dongle-throughput-measurement/
I had the same issue in production; I just added this line to my `.env` file:
ASSET_URL=https://example.com
Set your asset address with https as the value, like in this Stack Overflow article.
What I needed to do was:
SwaggerConfiguration oasConfig = new SwaggerConfiguration()
.openApi(baseModel)
.readerClass(MyCustomReader.class.getName()) // <- HERE
.prettyPrint(true)
.resourcePackages(Set.of("my.api.package"));
MyCustomReader extends io.swagger.v3.jaxrs2.Reader and overrides the read method:
public class MyCustomReader extends Reader {
@Override
public OpenAPI read(Class<?> cls,
String parentPath,
String parentMethod,
boolean isSubresource,
RequestBody parentRequestBody,
ApiResponses apiResponses,
Set<String> parentTags,
List<Parameter> parentParameters,
Set<Class<?>> scannedResources)
{
OpenAPI openAPI = super.read(cls, parentPath, parentMethod, isSubresource, parentRequestBody, apiResponses,
parentTags, parentParameters, scannedResources);
return populateMissingDescriptions(openAPI);
}
public static OpenAPI populateMissingDescriptions(OpenAPI openAPI) {
if (openAPI.getPaths() != null) {
for (Entry<String, PathItem> entry : openAPI.getPaths().entrySet()) {
String path = entry.getKey();
PathItem pathItem = entry.getValue();
for (Operation operation : pathItem.readOperations()) {
if (operation.getResponses() != null) {
for (Entry<String, ApiResponse> e : operation.getResponses().entrySet()) {
String status = e.getKey();
ApiResponse response = e.getValue();
if (response.getDescription() == null) {
response.setDescription("");
}
}
}
}
}
}
return openAPI;
}
}
For logback, add %ms for microseconds.
%d{yyyy-MM-dd'T'HH:mm:ss.SSS}%ms
Add the ngrok-skip-browser-warning: true header using the ModHeader extension.
I still can't figure it out. Can anyone help me?
I find your method very interesting. However, in a case where you don't know anything about your data, meaning you don't know the optimal distance threshold, how would you approach the problem? (You also can't choose it based on intuition with no profound study, since you can have a lot of clusters/sets of points with a big variety of deviations, very dense ones vs very sparse ones.) My goal is: given a numerical variable (1 dimension only), do an agglomerative clustering and get the optimal number of clusters, then do a binning just after that and transform it into a categorical variable. I'm thinking about using some indicators of information purity to measure the impact of each merge, but I haven't figured it out yet.
Name="flex items-center space-x-4 mb-4">
<Badge className={`${currentLevel.color} font-semibold`}>
{currentLevel.name}
</Badge>
<div className="flex items-center space-x-1">
<Star className="w-4 h-4" />
<span>{profile?.points || 0} Points</span>
</div>
</div>
<div className="space-y-2">
<div className="flex justify-between text-sm">
<span>Progress to {currentLevel.name === "Master" ? "Max Level" : "Next Level"}</span>
<span>{profile?.points || 0} / {nextLevelPoints}</span>
</div>
<Progress value={progressPercent} className="bg-white/20" />
</div>
</div>
</div>
</CardContent>
</Card>
{/* Stats Grid */}
<div className="grid md:grid-cols-4 gap-6">
<Card>
<CardContent className="p-6 text-center">
<Award className="w-8 h-8 text-blue-500 mx-auto mb-2" />
<p className="text-2xl font-bold text-blue-600">{profile?.points || 0}</p>
<p className="text-gray-600 text-sm">Total Points</p>
</CardContent>
</Card>
<Card>
<CardContent className="p-6 text-center">
<Calendar className="w-8 h-8 text-green-500 mx-auto mb-2" />
<p className="text-2xl font-bold text-green-600">{profile?.events_attended || 0}</p>
<p className="text-gray-600 text-sm">Events Attended</p>
</CardContent>
</Card>
<Card>
<CardContent className="p-6 text-center">
<TrendingUp className="w-8 h-8 text-purple-500 mx-auto mb-2" />
<p className="text-2xl font-bold text-purple-600">{profile?.impact_score || 0}</p>
<p className="text-gray-600 text-sm">Impact Score</p>
</CardContent>
</Card>
<Card>
<CardContent className="p-6 text-center">
<Trophy className="w-8 h-8 text-orange-500 mx-auto mb-2" />
<p className="text-2xl font-bold text-orange-600">{profile?.badges?.length || 0}</p>
<p className="text-gray-600 text-sm">Badges Earned</p>
</CardContent>
</Card>
</div>
{/* Badges Section */}
<Card>
<CardHeader>
<CardTitle className="flex items-center space-x-2">
<Star className="w-5 h-5" />
<span>Badges & Achievements</span>
</CardTitle>
<CardDescription>Recognition for your environmental impact</CardDescription>
</CardHeader>
<CardContent>
{profile?.badges && profile.badges.length > 0 ? (
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
{profile.badges.map((badge, index) => (
<div key={index} className="text-center p-4 bg-gradient-to-br from-yellow-50 to-orange-50 rounded-lg border">
<Award className="w-8 h-8 text-yellow-500 mx-auto mb-2" />
<p className="font-semibold text-sm">{badge}</p>
</div>
))}
</div>
) : (
<div className="text-center py-8">
<Target className="w-16 h-16 text-gray-400 mx-auto mb-4" />
<p className="text-gray-600">No badges earned yet</p>
<p className="text-sm text-gray-500">Participate in events to earn your first badge!</p>
</div>
)}
</CardContent>
</Card>
{/* Recent Activities */}
<Card>
<CardHeader>
<CardTitle className="flex items-center space-x-2">
<Calendar className="w-5 h-5" />
<span>Recent Activities</span>
</CardTitle>
<CardDescription>Your latest environmental contributions</CardDescription>
</CardHeader>
<CardContent>
{loading ? (
<div className="text-center py-4">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto"></div>
</div>
) : recentActivities.length > 0 ? (
<div className="space-y-4">
{recentActivities.map((activity, index) => (
<div key={index} className="flex items-center justify-between p-4 bg-gray-50 rounded-lg">
<div className="flex items-center space-x-3">
<div className="w-2 h-2 bg-green-500 rounded-full"></div>
<div>
<h4 className="font-semibold">{activity.events?.title}</h4>
<p className="text-sm text-gray-600">
{activity.events?.location} • {new Date(activity.events?.date).toLocaleDateString()}
</p>
</div>
</div>
<Badge className="bg-green-100 text-green-800">
+{activity.points_awarded} pts
</Badge>
</div>
))}
</div>
) : (
<div className="text-center py-8">
<Calendar className="w-16 h-16 text-gray-400 mx-auto mb-4" />
<p className="text-gray-600">No recent activities</p>
<p className="text-sm text-gray-500">Join events to see your activities here!</p>
</div>
)}
</CardContent>
</Card>
</div>
);
};
export default ProfileStats;
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\components\volunteer\SarthiBot.tsx ==========
import { useState, useRef, useEffect } from 'react';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
import { Button } from '@/components/ui/button';
import { Input } from '@/components/ui/input';
import { Badge } from '@/components/ui/badge';
import { MessageSquare, Send, Mic, MicOff, Volume2 } from 'lucide-react';
interface Message {
id: string;
text: string;
sender: 'user' | 'bot';
timestamp: Date;
}
const SarthiBot = () => {
const [messages, setMessages] = useState<Message[]>([
{
id: '1',
text: "Hi! I'm SarthiBot, your beach cleanup assistant. Ask me about events, eco-tips, or anything related to ocean conservation!",
sender: 'bot',
timestamp: new Date(),
}
]);
const [inputText, setInputText] = useState('');
const [isListening, setIsListening] = useState(false);
const [isLoading, setIsLoading] = useState(false);
const messagesEndRef = useRef<HTMLDivElement>(null);
const scrollToBottom = () => {
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
};
useEffect(() => {
scrollToBottom();
}, [messages]);
const sendMessage = async (text: string) => {
if (!text.trim()) return;
const userMessage: Message = {
id: Date.now().toString(),
text: text.trim(),
sender: 'user',
timestamp: new Date(),
};
setMessages(prev => [...prev, userMessage]);
setInputText('');
setIsLoading(true);
// Simulate AI response
setTimeout(() => {
const botResponse: Message = {
id: (Date.now() + 1).toString(),
text: getBotResponse(text),
sender: 'bot',
timestamp: new Date(),
};
setMessages(prev => [...prev, botResponse]);
setIsLoading(false);
}, 1000);
};
const getBotResponse = (userText: string): string => {
const lowerText = userText.toLowerCase();
if (lowerText.includes('event') || lowerText.includes('cleanup')) {
return "Great question! There are several beach cleanup events happening in Mumbai this week. Check the 'Discover' tab to see events near Juhu Beach, Marine Drive, and Versova Beach. Would you like me to help you register for one?";
}
if (lowerText.includes('point') || lowerText.includes('badge')) {
return "You earn points by participating in events! Beach cleanups give 50-100 points, educational modules give 25 points, and referring friends gives 30 points. You can unlock badges like 'Ocean Guardian' and 'Cleanup Champion'!";
}
if (lowerText.includes('plastic') || lowerText.includes('waste')) {
return "Here's an eco-tip: Microplastics are one of the biggest threats to marine life. Always sort waste during cleanups - plastics, glass, and organic matter should be separated. Every piece you collect makes a difference!";
}
if (lowerText.includes('mumbai') || lowerText.includes('beach')) {
return "Mumbai has amazing beaches perfect for cleanup activities! Juhu Beach is great for beginners, Marine Drive for urban cleanup, and Versova Beach has shown incredible transformation through community efforts. Which one interests you most?";
}
return "That's an interesting question! I'm here to help with beach cleanup activities, event information, and eco-tips. You can also ask me about points, badges, or conservation techniques. What would you like to know more about?";
};
const handleVoiceToggle = () => {
setIsListening(!isListening);
// Voice recognition would be implemented here
if (!isListening) {
// Start listening
setTimeout(() => {
setIsListening(false);
sendMessage("I heard you say something about beach cleanup events!");
}, 3000);
}
};
const speakText = (text: string) => {
if ('speechSynthesis' in window) {
const utterance = new SpeechSynthesisUtterance(text);
speechSynthesis.speak(utterance);
}
};
return (
<Card className="h-[600px] flex flex-col">
<CardHeader className="bg-gradient-to-r from-blue-500 to-cyan-500 text-white rounded-t-lg">
<CardTitle className="flex items-center space-x-2">
<MessageSquare className="w-5 h-5" />
<span>SarthiBot - AI Assistant</span>
</CardTitle>
<CardDescription className="text-blue-100">
Your friendly beach cleanup companion
</CardDescription>
</CardHeader>
<CardContent className="flex-1 flex flex-col p-0">
{/* Messages */}
<div className="flex-1 overflow-y-auto p-4 space-y-4">
{messages.map((message) => (
<div
key={message.id}
className={`flex ${message.sender === 'user' ? 'justify-end' : 'justify-start'}`}
>
<div
className={`max-w-[80%] p-3 rounded-lg ${
message.sender === 'user'
? 'bg-blue-500 text-white'
: 'bg-gray-100 text-gray-800'
}`}
>
<p className="text-sm">{message.text}</p>
{message.sender === 'bot' && (
<Button
size="sm"
variant="ghost"
className="mt-2 h-6 px-2 text-xs"
onClick={() => speakText(message.text)}
>
<Volume2 className="w-3 h-3" />
</Button>
)}
</div>
</div>
))}
{isLoading && (
<div className="flex justify-start">
<div className="bg-gray-100 p-3 rounded-lg">
<div className="flex space-x-1">
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce"></div>
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" style={{ animationDelay: '0.1s' }}></div>
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" style={{ animationDelay: '0.2s' }}></div>
</div>
</div>
</div>
)}
<div ref={messagesEndRef} />
</div>
{/* Input */}
<div className="border-t p-4">
<div className="flex space-x-2">
<Input
value={inputText}
onChange={(e) => setInputText(e.target.value)}
placeholder="Ask me anything about beach cleanup..."
onKeyPress={(e) => e.key === 'Enter' && sendMessage(inputText)}
className="flex-1"
/>
<Button
onClick={handleVoiceToggle}
variant={isListening ? "default" : "outline"}
size="sm"
className={isListening ? "bg-red-500 hover:bg-red-600" : ""}
>
{isListening ? <MicOff className="w-4 h-4" /> : <Mic className="w-4 h-4" />}
</Button>
<Button onClick={() => sendMessage(inputText)} size="sm">
<Send className="w-4 h-4" />
</Button>
</div>
{/* Quick Actions */}
<div className="flex flex-wrap gap-2 mt-3">
<Badge
variant="outline"
className="cursor-pointer hover:bg-blue-50"
onClick={() => sendMessage("Show me upcoming events")}
>
Upcoming Events
</Badge>
<Badge
variant="outline"
className="cursor-pointer hover:bg-blue-50"
onClick={() => sendMessage("How do I earn more points?")}
>
Earning Points
</Badge>
<Badge
variant="outline"
className="cursor-pointer hover:bg-blue-50"
onClick={() => sendMessage("Tell me about Mumbai beaches")}
>
Mumbai Beaches
</Badge>
</div>
</div>
</CardContent>
</Card>
);
};
export default SarthiBot;
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\hooks\use-mobile.tsx ==========
import * as React from "react"
const MOBILE_BREAKPOINT = 768
export function useIsMobile() {
const [isMobile, setIsMobile] = React.useState<boolean | undefined>(undefined)
React.useEffect(() => {
const mql = window.matchMedia(`(max-width: ${MOBILE_BREAKPOINT - 1}px)`)
const onChange = () => {
setIsMobile(window.innerWidth < MOBILE_BREAKPOINT)
}
mql.addEventListener("change", onChange)
setIsMobile(window.innerWidth < MOBILE_BREAKPOINT)
return () => mql.removeEventListener("change", onChange)
}, [])
return !!isMobile
}
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\hooks\use-toast.ts ==========
import * as React from "react"
import type {
ToastActionElement,
ToastProps,
} from "@/components/ui/toast"
const TOAST_LIMIT = 1
const TOAST_REMOVE_DELAY = 1000000
type ToasterToast = ToastProps & {
id: string
title?: React.ReactNode
description?: React.ReactNode
action?: ToastActionElement
}
const actionTypes = {
ADD_TOAST: "ADD_TOAST",
UPDATE_TOAST: "UPDATE_TOAST",
DISMISS_TOAST: "DISMISS_TOAST",
REMOVE_TOAST: "REMOVE_TOAST",
} as const
let count = 0
function genId() {
count = (count + 1) % Number.MAX_SAFE_INTEGER
return count.toString()
}
type ActionType = typeof actionTypes
type Action =
| {
type: ActionType["ADD_TOAST"]
toast: ToasterToast
}
| {
type: ActionType["UPDATE_TOAST"]
toast: Partial<ToasterToast>
}
| {
type: ActionType["DISMISS_TOAST"]
toastId?: ToasterToast["id"]
}
| {
type: ActionType["REMOVE_TOAST"]
toastId?: ToasterToast["id"]
}
interface State {
toasts: ToasterToast[]
}
const toastTimeouts = new Map<string, ReturnType<typeof setTimeout>>()
const addToRemoveQueue = (toastId: string) => {
if (toastTimeouts.has(toastId)) {
return
}
const timeout = setTimeout(() => {
toastTimeouts.delete(toastId)
dispatch({
type: "REMOVE_TOAST",
toastId: toastId,
})
}, TOAST_REMOVE_DELAY)
toastTimeouts.set(toastId, timeout)
}
export const reducer = (state: State, action: Action): State => {
switch (action.type) {
case "ADD_TOAST":
return {
...state,
toasts: [action.toast, ...state.toasts].slice(0, TOAST_LIMIT),
}
case "UPDATE_TOAST":
return {
...state,
toasts: state.toasts.map((t) =>
t.id === action.toast.id ? { ...t, ...action.toast } : t
),
}
case "DISMISS_TOAST": {
const { toastId } = action
// ! Side effects ! - This could be extracted into a dismissToast() action,
// but I'll keep it here for simplicity
if (toastId) {
addToRemoveQueue(toastId)
} else {
state.toasts.forEach((toast) => {
addToRemoveQueue(toast.id)
})
}
return {
...state,
toasts: state.toasts.map((t) =>
t.id === toastId || toastId === undefined
? {
...t,
open: false,
}
: t
),
}
}
case "REMOVE_TOAST":
if (action.toastId === undefined) {
return {
...state,
toasts: [],
}
}
return {
...state,
toasts: state.toasts.filter((t) => t.id !== action.toastId),
}
}
}
const listeners: Array<(state: State) => void> = []
let memoryState: State = { toasts: [] }
function dispatch(action: Action) {
memoryState = reducer(memoryState, action)
listeners.forEach((listener) => {
listener(memoryState)
})
}
type Toast = Omit<ToasterToast, "id">
function toast({ ...props }: Toast) {
const id = genId()
const update = (props: ToasterToast) =>
dispatch({
type: "UPDATE_TOAST",
toast: { ...props, id },
})
const dismiss = () => dispatch({ type: "DISMISS_TOAST", toastId: id })
dispatch({
type: "ADD_TOAST",
toast: {
...props,
id,
open: true,
onOpenChange: (open) => {
if (!open) dismiss()
},
},
})
return {
id: id,
dismiss,
update,
}
}
function useToast() {
const [state, setState] = React.useState<State>(memoryState)
React.useEffect(() => {
listeners.push(setState)
return () => {
const index = listeners.indexOf(setState)
if (index > -1) {
listeners.splice(index, 1)
}
}
}, [state])
return {
...state,
toast,
dismiss: (toastId?: string) => dispatch({ type: "DISMISS_TOAST", toastId }),
}
}
export { useToast, toast }
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\hooks\useAuth.tsx ==========
import { createContext, useContext, useState, useEffect, ReactNode } from 'react';
import { User, Session } from '@supabase/supabase-js';
import { supabase } from '@/integrations/supabase/client';
interface Profile {
id: string;
name: string;
role: 'volunteer' | 'organizer';
points: number;
badges: string[];
events_attended: number;
impact_score: number;
}
interface AuthContextType {
user: User | null;
profile: Profile | null;
session: Session | null;
isAuthenticated: boolean;
isLoading: boolean;
signIn: (email: string, password: string) => Promise<{ error: any }>;
signUp: (email: string, password: string, name: string, role: 'volunteer' | 'organizer') => Promise<{ error: any }>;
signOut: () => Promise<void>;
}
const AuthContext = createContext<AuthContextType | null>(null);
export const useAuth = () => {
const context = useContext(AuthContext);
if (!context) {
throw new Error('useAuth must be used within an AuthProvider');
}
return context;
};
interface AuthProviderProps {
children: ReactNode;
}
export const AuthProvider = ({ children }: AuthProviderProps) => {
const [user, setUser] = useState<User | null>(null);
const [profile, setProfile] = useState<Profile | null>(null);
const [session, setSession] = useState<Session | null>(null);
const [isLoading, setIsLoading] = useState(true);
const fetchProfile = async (userId: string) => {
try {
console.log('Fetching profile for user:', userId);
const { data, error } = await supabase
.from('profiles')
.select('*')
.eq('id', userId)
.single();
if (error) {
console.error('Error fetching profile:', error);
return null;
}
console.log('Profile data fetched:', data);
// Cast the role to the correct type
return {
...data,
role: data.role as 'volunteer' | 'organizer'
} as Profile;
} catch (error) {
console.error('Error fetching profile:', error);
return null;
}
};
useEffect(() => {
console.log('Setting up auth state listener');
// Set up auth state listener
const { data: { subscription } } = supabase.auth.onAuthStateChange(
async (event, session) => {
console.log('Auth state changed:', event, session?.user?.email);
setSession(session);
setUser(session?.user ?? null);
if (session?.user) {
// Fetch user profile data
setTimeout(async () => {
const profileData = await fetchProfile(session.user.id);
console.log('Setting profile data:', profileData);
setProfile(profileData);
setIsLoading(false);
}, 0);
} else {
setProfile(null);
setIsLoading(false);
}
}
);
// Check for existing session
supabase.auth.getSession().then(({ data: { session } }) => {
console.log('Initial session check:', session?.user?.email);
setSession(session);
setUser(session?.user ?? null);
if (session?.user) {
fetchProfile(session.user.id).then((profileData) => {
console.log('Initial profile data:', profileData);
setProfile(profileData);
setIsLoading(false);
});
} else {
setIsLoading(false);
}
});
return () => subscription.unsubscribe();
}, []);
const signIn = async (email: string, password: string) => {
setIsLoading(true);
const { error } = await supabase.auth.signInWithPassword({
email,
password,
});
if (error) {
setIsLoading(false);
}
return { error };
};
const signUp = async (email: string, password: string, name: string, role: 'volunteer' | 'organizer') => {
setIsLoading(true);
const redirectUrl = `${window.location.origin}/`;
const { error } = await supabase.auth.signUp({
email,
password,
options: {
emailRedirectTo: redirectUrl,
data: {
name,
role,
}
}
});
if (error) {
setIsLoading(false);
}
return { error };
};
const signOut = async () => {
console.log('Starting sign out process...');
setIsLoading(true);
try {
const { error } = await supabase.auth.signOut();
if (error) {
console.error('Error during sign out:', error);
throw error;
}
// Clear local state immediately
setUser(null);
setProfile(null);
setSession(null);
console.log('Sign out completed successfully');
} catch (error) {
console.error('Sign out failed:', error);
throw error;
} finally {
setIsLoading(false);
}
};
const value: AuthContextType = {
user,
profile,
session,
isAuthenticated: !!session,
isLoading,
signIn,
signUp,
signOut,
};
console.log('Auth context value:', { isAuthenticated: !!session, profile, isLoading });
return (
<AuthContext.Provider value={value}>
{children}
</AuthContext.Provider>
);
};
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\hooks\useCreateEventForm.ts ==========
import { useState } from 'react';
import { useAuth } from '@/hooks/useAuth';
import { supabase } from '@/integrations/supabase/client';
import { useToast } from '@/hooks/use-toast';
export interface EventFormData {
title: string;
description: string;
location: string;
date: string;
time: string;
maxVolunteers: string;
category: string;
}
export const useCreateEventForm = (onEventCreated: (event: any) => void, onClose: () => void) => {
const { user } = useAuth();
const { toast } = useToast();
const [formData, setFormData] = useState<EventFormData>({
title: '',
description: '',
location: '',
date: '',
time: '',
maxVolunteers: '',
category: 'beach-cleanup'
});
const [loading, setLoading] = useState(false);
const [errors, setErrors] = useState<Record<string, string>>({});
const validateForm = () => {
const newErrors: Record<string, string> = {};
if (!formData.title.trim()) newErrors.title = 'Event title is required';
if (!formData.description.trim()) newErrors.description = 'Description is required';
if (!formData.location.trim()) newErrors.location = 'Location is required';
if (!formData.date) newErrors.date = 'Date is required';
if (!formData.time) newErrors.time = 'Time is required';
if (!formData.maxVolunteers || parseInt(formData.maxVolunteers) < 1) {
newErrors.maxVolunteers = 'Valid number of volunteers is required';
}
setErrors(newErrors);
return Object.keys(newErrors).length === 0;
};
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
if (!validateForm() || !user) return;
setLoading(true);
try {
console.log('Creating event with data:', formData);
const { data: newEvent, error } = await supabase
.from('events')
.insert({
title: formData.title,
description: formData.description,
location: formData.location,
date: formData.date,
time: formData.time,
max_volunteers: parseInt(formData.maxVolunteers),
organizer_id: user.id,
status: 'Open',
difficulty: 'Easy',
points_reward: 50,
equipment: []
})
.select()
.single();
if (error) {
console.error('Error creating event:', error);
toast({
title: "Error",
description: "Failed to create event. Please try again.",
variant: "destructive",
});
return;
}
console.log('Event created successfully:', newEvent);
toast({
title: "Success!",
description: "Event created successfully!",
});
onEventCreated(newEvent);
// Reset form
setFormData({
title: '',
description: '',
location: '',
date: '',
time: '',
maxVolunteers: '',
category: 'beach-cleanup'
});
setErrors({});
onClose();
} catch (error) {
console.error('Error in handleSubmit:', error);
toast({
title: "Error",
description: "An unexpected error occurred.",
variant: "destructive",
});
} finally {
setLoading(false);
}
};
const handleChange = (field: string, value: string) => {
setFormData(prev => ({ ...prev, [field]: value }));
if (errors[field]) {
setErrors(prev => ({ ...prev, [field]: '' }));
}
};
return {
formData,
errors,
loading,
handleSubmit,
handleChange
};
};
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\integrations\supabase\client.ts ==========
// This file is automatically generated. Do not edit it directly.
import { createClient } from '@supabase/supabase-js';
import type { Database } from './types';
const SUPABASE_URL = "https://mbofnsqwfjmgmjlxjcxw.supabase.co";
const SUPABASE_PUBLISHABLE_KEY = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6Im1ib2Zuc3F3ZmptZ21qbHhqY3h3Iiwicm9sZSI6ImFub24iLCJpYXQiOjE3NTA3ODM2ODEsImV4cCI6MjA2NjM1OTY4MX0.DKFASdj-5udhXz8wX_ztIeBtvgrfjsyCzHpHHuro60U";
// Import the supabase client like this:
// import { supabase } from "@/integrations/supabase/client";
export const supabase = createClient<Database>(SUPABASE_URL, SUPABASE_PUBLISHABLE_KEY);
========== C:\Users\satya\Downloads\sarthi-beach-brigade-main\sarthi-beach-brigade-main\src\integrations\supabase\types.ts ==========
export type Json =
| string
| number
| boolean
| null
| { [key: string]: Json | undefined }
| Json[]
export type Database = {
public: {
Tables: {
event_attendance: {
Row: {
attended: boolean | null
event_id: string
id: string
marked_at: string | null
marked_by: string | null
points_awarded: number | null
volunteer_id: string
}
Insert: {
attended?: boolean | null
event_id: string
id?: string
marked_at?: string | null
marked_by?: string | null
points_awarded?: number | null
volunteer_id: string
It's not LiveKitClient (upper case 'K'), it's LivekitClient (lower case 'k')
I guess you accidentally put ':' into your path.
Also make sure that HADOOP_HOME is set correctly, because you will get a ClassNotFound exception with the same libs you mentioned when trying to run the Hive CLI.
According to the documentation, HADOOP_HOME is a requirement.
Although it's written that Hive 4.0.1 works with Hadoop 3.3.6, I've just checked it with Hadoop 3.4.1 and it works (despite some warnings about multiple SLF4J bindings, which should be fixed by devops or devs).
PS: I don't quite get what 'query data on apache spark with hive' means. Usually, Spark utilizes the Hive metastore to query Hive's tables.
Follow below steps.
Open Visual Studio 2022.
Click Create a new project.
Search for ASP.NET Core Empty.
Select the ASP.NET Core Empty template and click Next.
Choose .NET 8.0 as the target framework.
Click Create.
This template will give you a minimal setup with just Program.cs, and no Razor Pages or MVC.
This error occurs because node-abi (which electron-builder uses internally) doesn't recognize Electron version 37.1.0.
Try with a node-abi version released here:
Sadly it is not possible to change the default behaviour of the SnackbarHost; it will always stretch the full width of the screen.
Thus, if you want it to just wrap its content, you have to make a custom snackbar like you mentioned.
@xdurch0 answered this question in a comment: the solution is to set axis=1. The numbers aren't actually garbage; I was expecting values like 0.123 or 0.543, so I thought they were, but they actually do sum up to 1.0.
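For anyone landing here later, a tiny NumPy sketch of what axis=1 does (the logits are made up); each row of the result sums to 1:
import numpy as np
x = np.array([[1.0, 2.0, 3.0],
              [1.0, 1.0, 1.0]])
# Softmax over axis=1, i.e. across the columns of each row
e = np.exp(x - x.max(axis=1, keepdims=True))
probs = e / e.sum(axis=1, keepdims=True)
print(probs.sum(axis=1))  # [1. 1.]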
I am taking Java as an example, as I am comfortable with it. In a single JVM, locks like synchronized or ReentrantLock work well.
But in a clustered environment (e.g. multiple Spring Boot apps behind a load balancer), each node has its own memory, so:
Locking in memory doesn't prevent another node from performing the same action.
You need a shared coordination mechanism, like Redis, Zookeeper, etc. (a sketch follows below).
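The answer above talks in Java terms, but here is a minimal sketch of the Redis idea in Python with redis-py, just to make the mechanism concrete; the key name, TTL, and localhost instance are assumptions:
import uuid
import redis
r = redis.Redis(host="localhost", port=6379)  # assumed local Redis instance
def acquire_lock(name, ttl_ms=30_000):
    # SET key value NX PX ttl: only one node can create the key, and it auto-expires
    token = str(uuid.uuid4())
    return token if r.set(name, token, nx=True, px=ttl_ms) else None
def release_lock(name, token):
    # Delete the lock only if we still own it (atomic compare-and-delete via Lua)
    script = ("if redis.call('get', KEYS[1]) == ARGV[1] "
              "then return redis.call('del', KEYS[1]) else return 0 end")
    r.eval(script, 1, name, token)
token = acquire_lock("jobs:nightly-report")
if token:
    try:
        pass  # the work that must run on exactly one node
    finally:
        release_lock("jobs:nightly-report", token)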
After trying everything and a good night's sleep, I got enlightened...
The error was not in the service but in the Keycloak settings for the API user. There were two things in place that hid the real error.
Same issue: my PPT is generated according to my requirements, but I am getting repair errors when opening it. I have tried many ways to resolve it but am unable to remove it.
Please, can anyone help me? I have used OpenXML and HtmlAgilityPack.
You’ve already sniffed out the right ROUTER–DEALER pattern, but there is a nuance here: the broker must keep track of which client request is handled by which backend, so the response goes back to the right client.
You can try a ROUTER–DEALER–ROUTER pattern. This is why it will work:
Frontend: ROUTER socket (bind TCP) — talks to clients.
Broker middle: DEALER socket — talks to all backend handler threads.
Backends: each backend is a ROUTER socket, connected to the broker DEALER over inproc://.
So, the chain is:
CLIENT <---> ROUTER (broker frontend) <---> DEALER (broker backend) <---> ROUTER (per backend)
This lets you:
Use the built-in zmq_proxy() for non-blocking fair queuing.
Keep request identity frames intact.
Have each backend handle its own routing logic for responses.
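A minimal sketch of the broker piece in Python with pyzmq; the port number and endpoint names are assumptions, and the backend workers are omitted:
import zmq
ctx = zmq.Context.instance()
frontend = ctx.socket(zmq.ROUTER)  # clients connect here
frontend.bind("tcp://*:5555")
backend = ctx.socket(zmq.DEALER)   # backend handler threads connect to this inproc endpoint
backend.bind("inproc://backend")
# zmq.proxy shuttles messages both ways, fair-queuing requests to backends and
# preserving identity frames so replies find their way back to the right client
zmq.proxy(frontend, backend)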
To avoid using
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
try this plugin; it uses the device's download manager to download the file and also shows a progress notification.
You write that it works with scan_csv. Looking at the documentation, scan_csv seems to be the only option that supports glob patterns (see the example after the quoted descriptions):
read_csv: Read a CSV file into a DataFrame.
scan_csv: Lazily read from a CSV file or multiple files via glob patterns.
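A minimal example of the lazy variant, assuming the CSV files live under a data/ directory:
import polars as pl
# Lazily scan every CSV matching the glob, then materialise the combined frame once
df = pl.scan_csv("data/*.csv").collect()
print(df.shape)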
I have found a simple workaround in cases where you cannot use subqueries...
SELECT AVG(COALESCE([Ship Cost], 0)) * COUNT(DISTINCT [Tracking #]) ... GROUP BY [ORDER #]
Perhaps not the answer to the OP's situation, but I experienced the same issue recently and after digging around for ages, the problem was that Firefox needed to be granted Local Network permissions in macOS Settings > Privacy & Security. Sigh.
It seems to be an issue of the pytest-html package. An issue has been raised.
This is a very specific problem with some libraries in kotlin-dsl
Replace
implementation ("com.github.gbenroscience:parserng-android:0.1.1")
with
implementation (platform("com.github.gbenroscience:parserng-android:0.1.1"))
& you are good to go.