If you observe the effects of XHSETT (or other tools like USB3CV) after rebooting the system, then most likely the application failed to switch back to the normal USB stack. My solution to the problem is to open Device Manager, navigate to USB Compliance (Host/Device) Controllers, and uninstall the xHCI Compliance Test Host Controller (all of them if there is more than one). After doing so, just press the Scan for hardware changes button and the standard USB drivers will install.
Have you tried adding
.WithRequestTimeout(TimeSpan.FromSeconds(15));
to your MapGet call with your desired timeout?
OK, I found the reason: I should import the ApolloProvider like this
import { ApolloProvider } from '@apollo/client/react'
I had this error. It can also be caused by IIS Application Pool Setting having "Load user profile: false", which in our case was the default because we had installed "IIS6 compatibility" on the server.
Took me too many hours of googling to find the problem. Hopefully, I can help some other poor soul in the future.
See here: https://www.advancedinstaller.com/forums/viewtopic.php?t=26039
Single Site – One Magento installation with one website, one store, and one store view.
Multi-Store – One Magento installation managing multiple stores under the same website or different websites.
Multi-Store View – Different views (like languages or currencies) of a single store for localization or customization.
✅ Summary: Single site = 1 store, Multi-store = multiple stores, Multi-store view = multiple views of a store.
Good explanations on the diffing and performance parts, but missing a few points — content set with dangerouslySetInnerHTML isn’t managed by React (no React event handlers inside), and using innerHTML directly breaks React’s declarative model. Also note possible SSR/hydration mismatches and that the __html object is intentional to force explicit use.
I am wondering whether, when calling this function named forward, you were passing an argument of the wrong type. You should pass an argument of type Tensor; if you share more details, it will be easier to help.
The problem solved itself after updating Visual Studio. After trying everything I could come up with, I sent the project to a colleague, who could debug it. Apparently it was related to my VS installation.
Yes: in your child window's message handler, ensure that WM_MOUSEMOVE messages are passed to DefWindowProc (which will send them on to the parent).
In a Logic App, it's quite simple to retrieve any field value from a JSON structure. Below is a sample code snippet for your reference.
{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "contentVersion": "1.0.0.0",
        "triggers": {
            "When_an_HTTP_request_is_received": {
                "type": "Request",
                "kind": "Http"
            }
        },
        "actions": {
            "Parse_JSON": {
                "type": "ParseJson",
                "inputs": {
                    "content": "@triggerBody()",
                    "schema": {
                        "type": "object",
                        "properties": {
                            "statement_id": {
                                "type": "string"
                            },
                            "status": {
                                "type": "object",
                                "properties": {
                                    "state": {
                                        "type": "string"
                                    }
                                }
                            },
                            "manifest": {
                                "type": "object",
                                "properties": {
                                    "format": {
                                        "type": "string"
                                    },
                                    "schema": {
                                        "type": "object",
                                        "properties": {
                                            "column_count": {
                                                "type": "integer"
                                            },
                                            "columns": {
                                                "type": "array",
                                                "items": {
                                                    "type": "object",
                                                    "properties": {
                                                        "name": {
                                                            "type": "string"
                                                        },
                                                        "type_text": {
                                                            "type": "string"
                                                        },
                                                        "type_name": {
                                                            "type": "string"
                                                        },
                                                        "position": {
                                                            "type": "integer"
                                                        }
                                                    },
                                                    "required": [
                                                        "name",
                                                        "type_text",
                                                        "type_name",
                                                        "position"
                                                    ]
                                                }
                                            }
                                        }
                                    },
                                    "total_chunk_count": {
                                        "type": "integer"
                                    },
                                    "chunks": {
                                        "type": "array",
                                        "items": {
                                            "type": "object",
                                            "properties": {
                                                "chunk_index": {
                                                    "type": "integer"
                                                },
                                                "row_offset": {
                                                    "type": "integer"
                                                },
                                                "row_count": {
                                                    "type": "integer"
                                                }
                                            },
                                            "required": [
                                                "chunk_index",
                                                "row_offset",
                                                "row_count"
                                            ]
                                        }
                                    },
                                    "total_row_count": {
                                        "type": "integer"
                                    },
                                    "truncated": {
                                        "type": "boolean"
                                    }
                                }
                            },
                            "result": {
                                "type": "object",
                                "properties": {
                                    "chunk_index": {
                                        "type": "integer"
                                    },
                                    "row_offset": {
                                        "type": "integer"
                                    },
                                    "row_count": {
                                        "type": "integer"
                                    },
                                    "data_array": {
                                        "type": "array",
                                        "items": {
                                            "type": "array"
                                        }
                                    }
                                }
                            }
                        }
                    }
                },
                "runAfter": {}
            },
            "Response": {
                "type": "Response",
                "kind": "Http",
                "inputs": {
                    "statusCode": 200,
                    "body": "@{body('Parse_JSON')?['result']?['data_array'][0]?[0]}\n@{body('Parse_JSON')?['result']?['data_array'][1]?[0]}"
                },
                "runAfter": {
                    "Parse_JSON": [
                        "Succeeded"
                    ]
                }
            }
        },
        "outputs": {},
        "parameters": {
            "$connections": {
                "type": "Object",
                "defaultValue": {}
            }
        }
    },
    "parameters": {
        "$connections": {
            "type": "Object",
            "value": {}
        }
    }
}
Try string.Equals(str1, str2, StringComparison.OrdinalIgnoreCase)
Did you find the workaround? I tried setting up our local backend so it doesn’t depend on alpha.jitsi.net, and the timeout issue is gone. But now the global CSS is overriding everything, and webpack internals aren’t loading.
It is because you didn't write a rule for \n, so the default action is to echo it to the output. You need a rule to ignore all whitespace, something like [ \t\r\n\f]+ ;.
Anything between [ and ] means 'any of these characters'. The characters are space, TAB, CR, LF, FF. The + means 'one or more'. The ; is the C++ code to execute: in this case, an empty statement, meaning 'do nothing'.
First, for safety, copy the whole project containing the virtual env (assuming you have Python libraries/modules installed in it) and paste it into a different place or folder. Then open your editor, open a terminal (cmd), go into the directory where the venv is present, and activate it, e.g.
Another choice:
https://github.com/Merci-chao/userChrome.js#multi-tab-rows
Highlights
Tab Groups Support: Fully supports mouse operations for tab groups — even in multi-row mode — delivering a smoother, more graceful experience.
Enhanced Tab Animations: Adds fluid transitions for various tab-related actions.
Optimized Space Usage: Makes full use of available UI space, including the area beneath window control buttons.
Smooth Tab-Dragging Animation: Supports animated tab dragging even in multi-row mode.
Pinned Tabs Grid Layout: Pinned tabs are fixed in a compact grid when Tabs Bar is scrollable — ideal for managing large numbers of pinned tabs.
Native-Like Firefox Integration: Seamlessly aligns with Firefox’s behavior to support multi-row tabs as if natively built-in.
Theme Compatibility: Fully compatible with themes, regardless of how many tab rows are present.
I was encountering a similar error. My solution was placing the -f flag for tar last, closest to the file name.
I ran into this issue too, on iOS 26
You can get SQL_TEXT/SQL_FULLTEXT from V$SQLAREA by PROGRAM_ID, PROGRAM_LINE#, where PROGRAM_ID is the OBJECT_ID from DBA_OBJECTS. But considering the dynamic nature of those packages, I don't think it's worth reverse-engineering them.
In my case, I had to log into Docker using the desktop Docker application
What you are looking for is called "trampoline".
There's an implementation of trampoline that utilizes generators, which I think is very elegant. The basic idea is that when you yield a generator (B) inside a generator (A), the executor will run that generator (B) and send its return value back to the calling generator (A).
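To make the idea concrete, here is a minimal sketch in Python (names like trampoline and countdown are mine, not from any particular library):

```python
import types

def trampoline(gen):
    """Drive a generator-based computation without growing the call stack.
    When a running generator yields another generator, push and run it;
    when a generator returns, send its return value back to its caller."""
    stack = [gen]
    value = None
    while stack:
        try:
            yielded = stack[-1].send(value)
        except StopIteration as stop:
            stack.pop()
            value = stop.value   # return value flows back to the caller
        else:
            assert isinstance(yielded, types.GeneratorType), "yield generators only"
            stack.append(yielded)
            value = None         # a fresh generator must be primed with None
    return value

def countdown(n):
    """Deeply 'recursive' generator: a plain function would overflow."""
    if n == 0:
        return "done"
    return (yield countdown(n - 1))

print(trampoline(countdown(50000)))  # prints "done", no RecursionError
```

The stack of suspended generator objects lives on the heap, so the nesting depth is limited by memory rather than by the interpreter's recursion limit.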
Apparently, it's been talked about since 2022 - https://github.com/microsoft/TypeScript/issues/51556
There is a hacky way to do this; I'm wondering if anyone knows a better way:
type Foo = {
   a?: string; 
}
const getFooWithSatisfiesError = (function(){
    if(Math.random()) {
        return null; // this will result in type error
    }
    return { a: '1', 'b': 2 }
}) satisfies () => Foo
getFooWithSatisfiesError() 
// const getFooWithSatisfiesError: () => {
//    a: string;
//    b: number;
// }
Create a simple text file, insert the encoded data into it, and use the following command:
certutil -decode 1.txt 2.txt
The decoded text will be saved in 2.txt.
I know I'm late to the thread here, but what you really want is a native Oracle Advanced Queuing (AQ) object (supports FIFO, LIFO, etc.). Queuing has been supported natively in the Oracle database for decades. Learn more here:
You would create an AQ queue and associate that AQ object with your DBMS_SCHEDULER job. Then you just enqueue a message to trigger an event, and AQ kicks off the job. Here's the Oracle documentation:
Reading your use case, had you known about Oracle's native AQ feature, you might not have needed to create the DBMS_SCHEDULER artifacts at all. Your process that submits jobs would instead only need to enqueue messages, and the setup of the AQ object determines how you want things run (1 at a time, 50 at a time, LIFO, FIFO, etc.). Here's a great Ask Tom posting from back in 2012 that demonstrates the full setup with code:
Note: the URLs I cited above are for version 19c of the Oracle database. Just change the number in the URL from 19 to 23 to go to the 23ai version, which has many more enhancements and features than you'll ever get around to using because there are so many!
Adding this line into global.css right after @import "tailwindcss" fixes the problem.
@import "tailwindcss";
@custom-variant hover (&:hover);
This solution was originally intended to fix the issue where hover effects don’t work on devices that don’t explicitly support :hover, like the touchscreen laptop mentioned in the Reddit post. It effectively resolves the problem, and I haven’t encountered any issues with it.
Here is the reasoning from the inventor of the F1 score, C. J. van Rijsbergen, in his 1979 book Information Retrieval.
Define the set A of all positive items (|A| = TP + FN) and the set B of all items classified as positive (|B| = TP + FP). The symmetric difference A Δ B is the set of all items that appear in A or B but not both: exactly the false positives and false negatives (|A Δ B| = FP + FN).
We want to minimize the size of A Δ B, which ranges from 0 to |A| + |B|. Van Rijsbergen argues that what we really want to minimize is a normalized size of A Δ B, defined as E = |A Δ B| / (|A| + |B|) = (FP + FN) / (2TP + FP + FN).
Since we are looking for a performance metric, i.e. something to maximize, let's instead define F = 1 - E and maximize that. Plugging in the definitions and crunching the algebra gives F = 1 - E = 2TP / (2TP + FP + FN) = 2PR / (P + R), where P and R are precision and recall. This F is indeed the F1 score.
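A quick numerical check of this equivalence (a sketch; the example sets are arbitrary):

```python
def f1_from_sets(A, B):
    """F = 1 - E, where E = |A symmetric-difference B| / (|A| + |B|)."""
    sym_diff = A.symmetric_difference(B)
    return 1 - len(sym_diff) / (len(A) + len(B))

def f1_from_counts(tp, fp, fn):
    """The usual F1: harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

A = {1, 2, 3, 4}   # actual positives: TP + FN = 4
B = {3, 4, 5}      # predicted positives: TP + FP = 3 (TP = 2, FP = 1, FN = 2)
print(f1_from_sets(A, B), f1_from_counts(2, 1, 2))  # both are 4/7
```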
Try checking if it is nested within some of the other options in the sidebar, like source control, and you can then drag it back to the sidebar.
Your current approach compares characters in order, which is why it only matches prefixes like "j" or "ja".
If you want to check whether the scrap string contains all the characters needed to form the target name (ignoring order and case, but keeping spaces), you should compare character frequencies instead of positions.
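A minimal sketch of that idea in Python (function and variable names are illustrative):

```python
from collections import Counter

def can_form(scrap: str, target: str) -> bool:
    """True if `scrap` contains at least the characters needed to spell
    `target` (case-insensitive, order ignored, duplicates counted)."""
    need = Counter(target.lower())
    have = Counter(scrap.lower())
    return all(have[ch] >= cnt for ch, cnt in need.items())

print(can_form("majan es", "James"))  # True: j, a, m, e, s all available
print(can_form("jam", "James"))       # False: e and s are missing
```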
Take a look at this repository: https://github.com/kbr-ucl/dapr-workshop-aspire/tree/aspire-challenge-3/start-here. It contains an Aspire version of the Dapr University tutorials.
put
options(shiny.autoreload = FALSE)
in global.R
Finally it was a firewall issue.
I didn't think of it at first because it works as root.
sudo firewall-cmd --add-port 8001/tcp solved my issue.
I had a similar issue and it was because of openssl in the newer ubuntu. You can install the older openssl by running the command
rvm pkg install openssl
and then when you install ruby you have to supply the openssl-dir that is provided after installing via rvm pkg
rvm install 3.0.0 --with-openssl-dir=/usr/share/rvm/usr
I don't think there's any automated way to discover which system libraries a package uses. You could find out manually by watching for errors when you try to run your image and adding any libraries that fail to load. As long as your Python dependencies aren't changing, the external dependencies should generally be unchanged also.
Oh, and did I mention, read the docs for each Python dependency, e.g. gdal:
Dependencies
- libgdal (3.11.4 or greater).
 
When I came across this error message today, it was because browser-sync was returning a 404 page to the browser: my javascript file was not in the same directory that the HTML expected.
You should be able to create your Python 3.12 Linux code app on Azure, and it would contain the runtime needed to run your WebJobs. Also, in the Kudu console, when you SSH into your container, you should be able to find the pip tool path:
I was facing the same issue with an Azure VM using Windows. Here was the fix:
Uncheck "Use Windows trust store":
In DBeaver on Windows, navigate to Window > Preferences > Connections and uncheck "Use Windows trust store".
https://stackoverflow.com/a/48593318
I hope that helps.
stty erase ^H
or
stty erase ^?
One could argue that, syntactically, any problem+json content would also be valid json content. So, by that rationale, accepting the latter could be seen as implying acceptance of the former by extension.
No, +json types are not "subtypes" of application/json. I understand the rationale. application/json only says something about the syntax, not the format, and +json (and +xml) types say something about both the format and the syntax.
In other words, whilst we can assume that an Accept including application/json will be able to parse the problem JSON response, we cannot assume it will be able to process it.
I.e. if a request's Accept header specifies application/json but not application/problem+json, would it be valid to deliver an application/problem+json response?
It depends on if you want to content-negotiate or not. It is acceptable to not content-negotiate when an error is encountered. Serving Content-Type of x ignoring Accept is fine. You are not negotiating any content-type, but interpreting y to mean x is not intended (or valid in terms of Content Negotiation).
The appropriate thing to do is to serve a 406 Not Acceptable (or 415 Unsupported Media Type, when you have the same question about processing a request body), while still being as helpful as possible to the user. For example, when a user-agent sends our APIs application/json (among other things), we will show an HTML page with:
This endpoint tried really hard to show you the information you requested. Unfortunately you specified in your Accept header that you only wanted to see the following types: application/json.
Please add one of the following types to your Accept header to see the content or error message:
application/vnd.delftsolutions.endpoint_description.v1+json
text/vnd.delftsolutions.docs
*/*
text/*
text/html
application/vnd.opus-safety.site.v1+json
application/vnd.opus-safety.site.v2+json
application/vnd.opus-safety.site.v3+json
image/svg+xml
image/*
In case of an error, application/problem+json will be in this list, which is automatically rendered.
We tell our API consumers to interpret the Content-Type to be able to extract the error message.
So I am unsure.
In general, when you're following standards and you are uncertain, doing what's most "helpful" to you can often be the best choice. In this particular case, responding with a problem when someone is requesting json will likely make it harder to debug logs, so I wouldn't recommend it!
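To make the negotiation concrete, here is a framework-agnostic sketch (simplified Accept parsing with no q-value handling; all names are mine):

```python
def pick_content_type(accept_header: str, offered: list[str]):
    """Return the first offered type acceptable under a simplified Accept
    header, or None, in which case the server should answer 406 Not
    Acceptable and list the types it can produce."""
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    for wanted in accepted:
        for offer in offered:
            if wanted in ("*/*", offer):
                return offer
            # type wildcard such as text/*
            if wanted.endswith("/*") and offer.split("/")[0] == wanted.split("/")[0]:
                return offer
    return None

offers = ["application/problem+json", "text/html"]
print(pick_content_type("application/json", offers))       # None: +json is not application/json
print(pick_content_type("application/json, */*", offers))  # application/problem+json
```

Note that exact matching never equates application/json with application/problem+json, which is the point of the answer above.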
The "update row count after insertion" comment is followed by a line that increments i by one.
But you've added a new row that doesn't seem to have a checkbox, and when you increment i by one, that new row is what it points to. You should probably change the increment of i to i = i + 2 in the first case (but leave it i + 1 in the "else" case) in order for it to work properly.
Entity Framework Core (incl. 2.2) doesn’t have a first-class API for PostgreSQL table partitioning. You can still use code-first and migrations, but you must create the partitioned table and its partitions with raw SQL inside your migration(s), and then map your entity to the parent table.
I did it: an image stored in a text file using hex!
https://www.mediafire.com/file/lnm2br6il7a0fhn/plain_text_image_maker_FIXED.html/file
https://www.mediafire.com/file/evdmkz6pfb91iz9/plain_text_image_viewer.zip/file
In my case, here is the solution (maybe someone needs a reference applying what @armstrove said):
# models.py
class PagexElement(models.Model):
    """Smallest reusable building block for page construction. Elements contain HTML templates with placeholders that can be customized when used in pages. One or more elements compose a section."""
    ...
class PagexSection(models.Model):
    """Collection of elements that form a reusable section. Sections are composed of one or more elements and define the structure for page layouts."""
    ...
class PagexInterSectElem(models.Model):
    """Intermediary model to handle the ordering of elements within a section. Allows the same element to appear in different positions across sections."""
    ...
    class Meta:
        unique_together = [["section", "element", "order"]]
# admin.py
from adminsortable2.admin import SortableAdminBase, SortableInlineAdminMixin
from django.contrib import admin
from . import forms, models
...
class PagexInterSectElemInline(SortableInlineAdminMixin, admin.TabularInline):
    model = models.PagexInterSectElem
    formset = forms.PagexInterSectElemFormSet
    ...
@admin.register(models.PagexSection)
class PagexSectionAdmin(SortableAdminBase, admin.ModelAdmin):
    """Customizes the management of layout sections."""
    ...
# forms.py
from adminsortable2 import admin
from django import forms
...
class PagexInterSectElemFormSet(admin.CustomInlineFormSet, forms.BaseInlineFormSet):
    """Custom formset that allows duplicate elements in different section positions."""
    def validate_unique(self):
        # Skip the default unique validation for 'element' field! Pagex only cares about section+order uniqueness (handled by DB constraint)!
        super().validate_unique()
    def _validate_unique_for_date_fields(self):
        # Override to prevent element uniqueness validation!
        pass
Cheers.
# code/python (save as bot.py)
"""
Secure minimal Telegram bot using python-telegram-bot (v20+)
Features:
- Token loaded from env var
- Admin-only commands (by Telegram user_id)
- Safe DB access (sqlite + parameterized queries)
- Graceful error handling & simple rate limiting
- Example of using webhook (recommended) or polling fallback
"""
import os
import logging
import sqlite3
import time
from functools import wraps

from telegram import Update, Bot
from telegram.ext import ApplicationBuilder, CommandHandler, ContextTypes, MessageHandler, filters

# --- Configuration (from environment) ---
BOT_TOKEN = os.getenv("TELEGRAM_BOT_TOKEN")
if not BOT_TOKEN:
    raise SystemExit("TELEGRAM_BOT_TOKEN env var required")

ALLOWED_ADMINS = {int(x) for x in os.getenv("BOT_ADMINS", "").split(",") if x.strip()}  # comma-separated IDs
WEBHOOK_URL = os.getenv("WEBHOOK_URL")  # e.g. https://your.domain/path
DATABASE_PATH = os.getenv("BOT_DB_PATH", "bot_data.sqlite")

# --- Logging ---
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger("secure_bot")

# --- DB helpers (safe parameterized queries) ---
def init_db():
    conn = sqlite3.connect(DATABASE_PATH, check_same_thread=False)
    c = conn.cursor()
    c.execute("""CREATE TABLE IF NOT EXISTS messages (
                  id INTEGER PRIMARY KEY AUTOINCREMENT,
                  user_id INTEGER,
                  username TEXT,
                  text TEXT,
                  ts INTEGER
                )""")
    conn.commit()
    return conn

db = init_db()

# --- Admin check decorator ---
def admin_only(func):
    @wraps(func)
    async def wrapper(update: Update, context: ContextTypes.DEFAULT_TYPE):
        user = update.effective_user
        if not user or user.id not in ALLOWED_ADMINS:
            logger.warning("Unauthorized access attempt by %s (%s)",
                           user.id if user else "unknown",
                           user.username if user else "")
            if update.effective_chat:
                await update.effective_chat.send_message("Unauthorized.")
            return
        return await func(update, context)
    return wrapper

# --- Handlers ---
async def start(update: Update, context: ContextTypes.DEFAULT_TYPE):
    await update.message.reply_text("Hello. This bot is configured securely. Use /help for commands.")

async def help_cmd(update: Update, context: ContextTypes.DEFAULT_TYPE):
    await update.message.reply_text("/status - admin only\n/echo <text> - echo back\n/help - this message")

@admin_only
async def status(update: Update, context: ContextTypes.DEFAULT_TYPE):
    await update.message.reply_text("OK, bot is running.")

async def echo(update: Update, context: ContextTypes.DEFAULT_TYPE):
    # store incoming message safely
    try:
        msg = update.effective_message
        with db:
            db.execute("INSERT INTO messages (user_id, username, text, ts) VALUES (?, ?, ?, ?)",
                       (msg.from_user.id, msg.from_user.username or "", msg.text or "", int(time.time())))
        # simple rate limit: disallow messages > 400 chars
        if msg.text and len(msg.text) > 400:
            await msg.reply_text("Message too long.")
            return
        await msg.reply_text(msg.text or "Empty message.")
    except Exception as e:
        logger.exception("Error in echo handler: %s", e)
        await update.effective_chat.send_message("Internal error.")

# --- Basic command to remind admins to rotate the token (admin only) ---
@admin_only
async def rotate_reminder(update: Update, context: ContextTypes.DEFAULT_TYPE):
    await update.message.reply_text("Reminder: rotate token and update TELEGRAM_BOT_TOKEN env var on server.")

# --- Build application ---
async def main():
    app = ApplicationBuilder().token(BOT_TOKEN).concurrent_updates(8).build()
    app.add_handler(CommandHandler("start", start))
    app.add_handler(CommandHandler("help", help_cmd))
    app.add_handler(CommandHandler("status", status))
    app.add_handler(CommandHandler("rotate", rotate_reminder))
    app.add_handler(CommandHandler("echo", echo))
    app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, echo))

    # Webhook preferred (more secure than polling) if WEBHOOK_URL provided
    if WEBHOOK_URL:
        # set webhook (TLS must be handled by your web server/reverse proxy)
        bot = Bot(token=BOT_TOKEN)
        await bot.set_webhook(WEBHOOK_URL)
        logger.info("Webhook set to %s", WEBHOOK_URL)
        # start the app; for production you should run an ASGI server with
        # endpoints feeding app.update_queue
        await app.initialize()
        await app.start()
        logger.info("Bot started with webhook mode (app running).")
        # keep running until terminated
        await app.updater.stop()  # placeholder to keep structure consistent
    else:
        # fallback to polling (useful for dev only)
        logger.info("Starting in polling mode (development only).")
        await app.run_polling()

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
select distinct name
from actor
where id in (select actorid
             from casting
             where movieid in (select movieid
                               from casting
                               where actorid in (select id
                                                 from actor
                                                 where name = 'Art Garfunkel')))
  and name != 'Art Garfunkel'
OK, so I have the same problem, but I need to convert Lua into Python because I know Lua but not Python.
The error occurs because passkey-based MFA, such as fingerprint or Face ID, is only supported for browser-based login, not for programmatic access. As you rightly mentioned, you can use key-pair authentication. Additionally, you can use a Programmatic access token (PAT) and a DUO MFA method, where you receive a push notification on your mobile device to log in to Snowflake. However, for DUO MFA, it is less ideal for automation as it still requires some user interaction.
If it were me, what I would do is create a function that takes a Param and returns a Query with the same data, call it toQuery() or something like that, and the same in reverse (a toParam() on the Query object - and then change the code to query[queryKey] = param[paramKey].toQuery()
Not that I've tested it, but it seems like that would remove a lot of your issues and probably all of them. In general, it makes sense to delegate to objects that need to turn themselves into other objects to do that themselves, rather than expect some supertyping or generic mechanism to do it for you in tricky cases like this one.
The following approach will keep the format of existing worksheets.
# Create an existing workbook from which we want to extract sheets
firstWb <- createWorkbook()
addWorksheet(firstWb, sheetName = "One")
addWorksheet(firstWb, sheetName = "Two")
addWorksheet(firstWb, sheetName = "Three")
writeData(firstWb, sheet = "One", x = matrix(1))
writeData(firstWb, sheet = "Two", x = matrix(2))
writeData(firstWb, sheet = "Three", x = matrix(3))
# Make a copy and remove sheets that we don't want to merge
theWb <- copyWorkbook(firstWb)
openxlsx::removeWorksheet(theWb, "One")
# Add new sheets
addWorksheet(theWb, sheetName = "Zero")
writeData(theWb, sheet = "Zero", x = matrix(0))
addWorksheet(theWb, sheetName = "Five")
writeData(theWb, sheet = "Five", x = matrix(5))
# Reorder sheets
nams <- 1:length(names(theWb))
names(nams) <- names(theWb)
worksheetOrder(theWb) <- nams[c("Zero", "Two", "Three", "Five")]
# Save
saveWorkbook(theWb, file = "Combined.xlsx")
You should be able to simply use latest as your version_id or omit /versions/{version_id}. That'll default to latest.
source: https://cloud.google.com/secret-manager/docs/access-secret-version
I removed the extra space or newline while adding the origin and then pushed the changes
Have you tried implementing module1 in app? Since Dagger auto-generates code, I suppose that when app generates the DaggerModule2Component, it also needs to see module1 to generate the underlying code. With your Gradle settings, app implements module2 but can't see module1, because implementation doesn't permit transitive dependencies.
Upon running pip show qwen-vl-utils, I found that it requires av, packaging, pillow, and requests. Each of these imported into Python without error, with the exception of av.
I found that running:
pip uninstall av (to uninstall av from pip)
and then
conda install -c conda-forge av
to install it via conda fixed this issue with OpenSSL.
I thought I'd post this in case anyone else runs into this issue trying to run the new Qwen models or otherwise :)
I had a similar issue, caused by the fact that I wanted to break a list of unique items into blocks for parallel processing. My solution was the Chunk extension method, which eliminated my need to remove items from the HashSet entirely.
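For comparison, the same chunking idea sketched in Python (.NET's Chunk has a stdlib twin only from Python 3.12's itertools.batched onward; this helper is illustrative):

```python
from itertools import islice

def chunks(items, size):
    """Yield consecutive blocks of up to `size` items, like .NET's Chunk()."""
    it = iter(items)
    while block := list(islice(it, size)):
        yield block

unique = {f"item{i}" for i in range(10)}
blocks = list(chunks(sorted(unique), 4))
print([len(b) for b in blocks])  # [4, 4, 2]
```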
But I want my widget to look exactly the same as Apple's Shortcuts widget: the grid layout should have the same proportions, spacing, and button sizes across all three widget sizes (small, medium, and large).
You should use the inspect tool at https://moleburrow.com/console/inspect to see the request and response headers. Everything will be clear there.
For ngrok, you can go to http://127.0.0.1:4040/inspect/http.
Make sure you use HTTPS; otherwise, the cookie will be ignored. Also, don’t use sameSite: 'none' if your backend and frontend are on the same domain.
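To make the attribute interplay concrete, here is a small self-contained sketch that just builds a Set-Cookie header string by hand (the helper name is mine, not from any library); with the frontend and backend on the same domain, SameSite=Lax plus Secure is the usual combination:

```javascript
// Builds a Set-Cookie header value to illustrate the relevant attributes.
function buildSetCookie(name, value, { sameSite = 'Lax', secure = true } = {}) {
  const parts = [`${name}=${value}`, 'HttpOnly', 'Path=/'];
  // Browsers reject SameSite=None cookies that are not also marked Secure.
  if (secure) parts.push('Secure');
  parts.push(`SameSite=${sameSite}`);
  return parts.join('; ');
}

console.log(buildSetCookie('session', 'abc123'));
// session=abc123; HttpOnly; Path=/; Secure; SameSite=Lax
```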
With this approach, it's assumed that you have a Date dimension table available, which is very common. I am providing a snippet of the table used for this purpose.
Then create a function which does the job.
Call the function by passing any date, and it returns the previous business day's date.
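The same logic can be sketched without a date dimension at all, using plain weekday arithmetic (weekends only; holidays would still need a lookup table, which is what the dimension table provides):

```python
from datetime import date, timedelta

def previous_business_day(d: date) -> date:
    """Step back one day at a time, skipping Saturday (5) and Sunday (6)."""
    d -= timedelta(days=1)
    while d.weekday() >= 5:
        d -= timedelta(days=1)
    return d

print(previous_business_day(date(2024, 1, 8)))  # Monday -> 2024-01-05 (Friday)
```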
You can also downgrade the JDK to Java 8 if your requirements permit.
If SSL (TLS) pinning is configured through Info.plist, i.e. using NSAppTransportSecurity, as described in Apple's Identity Pinning: How to configure server certificates for your app post, it automatically becomes applied to AVPlayer's streams.
However, I don't have a source from Apple confirming this; only my own testing with Proxyman.
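For reference, a sketch of what that Info.plist entry looks like (domain and hash are placeholders; the keys follow the NSPinnedDomains scheme from Apple's post):

```xml
<key>NSAppTransportSecurity</key>
<dict>
  <key>NSPinnedDomains</key>
  <dict>
    <key>example.com</key>
    <dict>
      <key>NSIncludesSubdomains</key>
      <true/>
      <key>NSPinnedCAIdentities</key>
      <array>
        <dict>
          <key>SPKI-SHA256-BASE64</key>
          <string>BASE64_SPKI_SHA256_HASH_PLACEHOLDER</string>
        </dict>
      </array>
    </dict>
  </dict>
</dict>
```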
thanks for outlining the details! I might be way off-base here without seeing your Zap setup, but some suggestions are below.
The issue might be that AWTOMIC bundles pass data to Zapier differently in live orders versus testing.
When you test, Zapier may be expanding the bundle items into separate line items, but in live orders, AWTOMIC is likely passing bundle contents as line item properties or metadata rather than as separate line items.
You should check the Shopify line item properties. In your Zapier trigger step, look for fields similar to these:
Line Items Properties Name
Line Items Properties Value
These often contain bundle item details so that you can track the name and quantity of each meal order. Map these to your Code step instead of just Title/Quantity.
R’s formula machinery canonicalizes interaction labels by sorting the names inside :. So b:a and a:b are the same term, and when you pass the terms object to model.matrix() it will print the canonical label (usually a:b) regardless of the order you wrote in the formula—even with keep.order = TRUE (which only controls the order of terms, not the order of variables within an interaction).
You can verify they’re identical:
dd <- data.frame(a = 1:3, b = 1:3)
mm1 <- model.matrix(terms(~ a + b + b:a, keep.order = TRUE), dd)
mm2 <- model.matrix(terms(~ a + b + a:b, keep.order = TRUE), dd)
all.equal(mm1, mm2)
If you absolutely need the printed column name to match your original b:a, just rename after creation:
mm <- model.matrix(terms(~ a + b + b:a, keep.order = TRUE), dd)
colnames(mm) <- sub("^a:b$", "b:a", colnames(mm))
mm
(For wider cases you could write a small renamer that maps any x:y to your preferred order.)
So the behavior you’re seeing is expected: terms()/model.matrix() normalize interaction labels, and there’s no option to keep b:a other than renaming the columns post hoc.
I'll answer the question in the title which doesn't really match the question content, just for the benefit of those googling and ending up here:
This tool is very useful for showing a dependency tree for any project: https://github.com/marss19/reference-conflicts-analyzer
VS2022 extension link: https://marketplace.visualstudio.com/items?itemName=MykolaTarasyuk.ReferenceConflictsAnalyserVS2022
Short answer: within a single database, the secondary will apply changes in the same commit/LSN order as the primary, but a readable secondary can lag—so your query might not see the most recent commits yet. There’s no cross-database ordering guarantee.
Synchronous commit: a primary commit isn’t acknowledged until the log block is hardened on the synchronous secondary. This preserves commit order, but the secondary still has to redo those log records before they’re visible to reads, so you can be milliseconds–seconds behind.
Asynchronous commit: the secondary can lag arbitrarily; visibility is eventually consistent, but the redo still follows LSN order.
Readable secondaries use snapshot isolation, so any single query sees a transactionally consistent point-in-time view up to the last redone LSN; it won’t see “reordered” data, just possibly older data.
Parallel redo (newer versions) replays independent transactions concurrently but preserves required dependencies/ordering; waits occur if one record must be redone before another.
If you absolutely require up-to-the-latest, strictly ordered visibility for consumers, read from the primary (or gate reads on the secondary until it has redone to the LSN/commit time you require).
I recently found that when returning std::pair from a function, an extra move is needed compared to an aggregate struct. See this:
https://godbolt.org/z/b63K9bzfs
All of f1(), f2(), f3() call the constructor twice, since we are constructing two new A objects. But for f1(), the objects are constructed directly in the caller; for f3(), two extra moves are performed. I don't know how to optimize those away.
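A minimal, self-contained sketch of the effect (type and function names are mine; the counts assume C++17 guaranteed copy elision):

```cpp
#include <utility>

struct A {
    static inline int ctors = 0, moves = 0;
    A() { ++ctors; }
    A(A&&) noexcept { ++moves; }
};

struct Agg { A a, b; };  // plain aggregate, like f1()'s return type

// Aggregate: each A{} initializes its member directly in the result object.
Agg make_agg() { return {A{}, A{}}; }            // 2 ctors, 0 moves

// std::pair: the temporaries bind to pair's (U1&&, U2&&) constructor,
// which then move-constructs the members, like f3().
std::pair<A, A> make_pr() { return {A{}, A{}}; } // 2 ctors, 2 moves
```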
var isClickHandlerBlocked = false;
async function toDo() {
    if (isClickHandlerBlocked)
        return;
    isClickHandlerBlocked = true;
    try {
        // await do something
    } finally {
        isClickHandlerBlocked = false; // reset even if the work throws
    }
}
An easy approach: it does not prevent the click event itself, but it does prevent the action from running again while one invocation is still in flight.
It is not strictly impossible to use SNOPT 7.7, but the Drake build system (its patches and expectations) is tightly coupled to earlier versions, so you will have to do nontrivial patch adaptation.
It seems you are using a card or a contact reader that only supports T=0, using an implementation of javax.smartcardio that doesn't support extended length over T=0.
In your first example, you ask to connect in any protocol, and the card and the reader have agreed on T=0. Upon sending your extended C-APDU, the implementation fails because it does not support sending extended APDU over T=0.
In your second example, you force usage of T=1; however, either the card or the reader doesn't support T=1.
Have you checked your reader doesn't have known bugs? https://ccid.apdu.fr/ccid/section.html
Have you checked the card ATR to see if it is configured for dual T=1/T=0? https://smartcard-atr.apdu.fr/
Does your card have a contactless interface? If you have a contactless reader, extended length is nearly always supported by those readers.
It's not the best solution, but for millions of files I added an "Expire current versions of objects" lifecycle rule with a 1-day configuration.
So everything was deleted after 1 day!
As far as I know, this is a limitation with LangFuse. To get better traces, you can define custom spans, but that's a chore depending on how much you need it.
I have a solution: you don't need to use ref for reactivity; instead use shallowRef, triggerRef and markRaw. I created a composable with all the Google Maps options; please test it. I'm from Chile, so parts of the code are in Spanish, but you can follow the logic. Google Maps can't create advanced markers if the properties are reactive.
import { shallowRef, onUnmounted, triggerRef, markRaw } from 'vue';
export default function useMapaComposable() {
  // ✅ Map state - use shallowRef for external objects
  const mapa = shallowRef(null);
  const googleMaps = shallowRef(null);
  const isLoaded = shallowRef(false);
  const isLoading = shallowRef(false);
  // Map element collections - using shallowRef for Maps
  const marcadores = shallowRef(new Map());
  const polilineas = shallowRef(new Map());
  const circulos = shallowRef(new Map());
  const poligonos = shallowRef(new Map());
  const infoWindows = shallowRef(new Map());
  const listeners = shallowRef(new Map());
  /**
   * Load the Google Maps API with loading=async
   */
  const cargarGoogleMapsAPI = apiToken => {
    return new Promise((resolve, reject) => {
      // If it's already loaded, resolve immediately
      if (window.google && window.google.maps) {
        // ✅ Use markRaw to avoid deep reactivity
        googleMaps.value = markRaw(window.google.maps);
        isLoaded.value = true;
        resolve(window.google.maps);
        return;
      }
      // If it's already loading, wait for it
      if (isLoading.value) {
        const checkLoaded = setInterval(() => {
          if (isLoaded.value) {
            clearInterval(checkLoaded);
            resolve(window.google.maps);
          }
        }, 100);
        return;
      }
      isLoading.value = true;
      // Create a unique global callback
      const callbackName = `__googleMapsCallback_${Date.now()}`;
      window[callbackName] = () => {
        // ✅ Use markRaw to avoid deep reactivity
        googleMaps.value = markRaw(window.google.maps);
        isLoaded.value = true;
        isLoading.value = false;
        // Clean up the callback
        delete window[callbackName];
        resolve(window.google.maps);
      };
      const script = document.createElement('script');
      script.src = `https://maps.googleapis.com/maps/api/js?key=${apiToken}&libraries=marker,places,geometry&loading=async&callback=${callbackName}`;
      script.async = true;
      script.defer = true;
      script.onerror = () => {
        isLoading.value = false;
        delete window[callbackName];
        reject(new Error('Failed to load Google Maps API'));
      };
      document.head.appendChild(script);
    });
  };
  /**
   * Initialize the map
   */
  const inicializarMapa = async (apiToken, divElement, opciones = {}) => {
    try {
      await cargarGoogleMapsAPI(apiToken);
      const opcionesDefault = {
        center: { lat: -33.4489, lng: -70.6693 },
        zoom: 12,
        mapTypeId: googleMaps.value.MapTypeId.ROADMAP,
        streetViewControl: true,
        mapTypeControl: true,
        fullscreenControl: true,
        zoomControl: true,
        gestureHandling: 'greedy',
        backgroundColor: '#e5e3df',
        ...opciones,
      };
      if (!opcionesDefault.mapId) {
        console.warn(
          '⚠️ No mapId provided. Advanced markers will not work.'
        );
      }
      // ✅ Create the map and mark it as non-reactive
      const mapaInstance = new googleMaps.value.Map(
        divElement,
        opcionesDefault
      );
      mapa.value = markRaw(mapaInstance);
      // Wait until the map is fully rendered
      await new Promise(resolve => {
        googleMaps.value.event.addListenerOnce(
          mapa.value,
          'tilesloaded',
          resolve
        );
      });
      // Add an extra delay to ensure rendering is complete
      await new Promise(resolve => setTimeout(resolve, 300));
      // Force a resize to make sure everything is visible
      googleMaps.value.event.trigger(mapa.value, 'resize');
      // Re-center after the resize
      mapa.value.setCenter(opcionesDefault.center);
      console.log('✅ Map fully initialized and ready');
      return mapa.value;
    } catch (error) {
      console.error('Failed to initialize the map:', error);
      throw error;
    }
  };
  // ==================== MARKERS ====================
  const crearMarcador = (id, opciones = {}) => {
    if (!mapa.value || !googleMaps.value) {
      console.error('The map is not initialized');
      return null;
    }
    const opcionesDefault = {
      position: { lat: -33.4489, lng: -70.6693 },
      map: mapa.value,
      title: '',
      draggable: false,
      animation: null,
      icon: null,
      label: null,
      ...opciones,
    };
    // ✅ Mark the marker as non-reactive
    const marcador = markRaw(new googleMaps.value.Marker(opcionesDefault));
    marcadores.value.set(id, marcador);
    triggerRef(marcadores);
    return marcador;
  };
  const crearMarcadorAvanzado = async (id, opciones = {}) => {
    if (!mapa.value || !googleMaps.value) {
      console.error('❌ The map is not initialized');
      return null;
    }
    const mapId = mapa.value.get('mapId');
    if (!mapId) {
      console.error(
        '❌ Error: a mapId is required to create advanced markers'
      );
      console.error('💡 Fix: pass mapId when initializing the map');
      return null;
    }
    try {
      // Import the required libraries
      const { AdvancedMarkerElement, PinElement } =
        await googleMaps.value.importLibrary('marker');
      const { pinConfig, ...opcionesLimpias } = opciones;
      // Set up default options
      const opcionesDefault = {
        map: mapa.value, // ✅ Works now because mapa is markRaw
        position: { lat: -33.4489, lng: -70.6693 },
        title: '',
        gmpDraggable: false,
        ...opcionesLimpias,
      };
      // If no custom content is provided, create a PinElement
      if (!opcionesDefault.content) {
        const pinConfigDefault = {
          background: '#EA4335',
          borderColor: '#FFFFFF',
          glyphColor: '#FFFFFF',
          scale: 1.5,
          ...pinConfig,
        };
        const pin = new PinElement(pinConfigDefault);
        opcionesDefault.content = pin.element;
      }
      // ✅ Create the marker and mark it as non-reactive
      const marcador = markRaw(new AdvancedMarkerElement(opcionesDefault));
      // Store the reference
      marcadores.value.set(id, marcador);
      triggerRef(marcadores);
      console.log('✅ Advanced marker created:', id, opcionesDefault.position);
      return marcador;
    } catch (error) {
      console.error('❌ Failed to create advanced marker:', error);
      console.error('📝 Details:', error.message);
      return null;
    }
  };
  const obtenerMarcador = id => {
    return marcadores.value.get(id);
  };
  const eliminarMarcador = id => {
    const marcador = marcadores.value.get(id);
    if (!marcador) {
      return false;
    }
    // Clean up listeners
    const elementListeners = listeners.value.get(id);
    if (elementListeners) {
      elementListeners.forEach(listener => {
        googleMaps.value.event.removeListener(listener);
      });
      listeners.value.delete(id);
      triggerRef(listeners);
    }
    // Remove from the map
    if (marcador.setMap) {
      marcador.setMap(null);
    }
    // For advanced markers
    if (marcador.map !== undefined) {
      marcador.map = null;
    }
    // Delete the reference and force reactivity
    marcadores.value.delete(id);
    triggerRef(marcadores);
    return true;
  };
  const eliminarTodosMarcadores = () => {
    marcadores.value.forEach((marcador, id) => {
      // Clean up listeners
      const elementListeners = listeners.value.get(id);
      if (elementListeners) {
        elementListeners.forEach(listener => {
          googleMaps.value.event.removeListener(listener);
        });
        listeners.value.delete(id);
      }
      // Remove from the map
      if (marcador.setMap) {
        marcador.setMap(null);
      }
      // For advanced markers
      if (marcador.map !== undefined) {
        marcador.map = null;
      }
    });
    // Clear collections
    marcadores.value.clear();
    listeners.value.clear();
    // Force reactivity
    triggerRef(marcadores);
    triggerRef(listeners);
  };
  const animarMarcador = (id, animacion = 'BOUNCE') => {
    const marcador = marcadores.value.get(id);
    if (marcador && marcador.setAnimation) {
      const animationType =
        animacion === 'BOUNCE'
          ? googleMaps.value.Animation.BOUNCE
          : googleMaps.value.Animation.DROP;
      marcador.setAnimation(animationType);
      if (animacion === 'BOUNCE') {
        setTimeout(() => {
          if (marcadores.value.has(id)) {
            marcador.setAnimation(null);
          }
        }, 2000);
      }
    }
  };
  // ==================== POLYLINES ====================
  const crearPolilinea = (id, coordenadas, opciones = {}) => {
    if (!mapa.value || !googleMaps.value) {
      console.error('The map is not initialized');
      return null;
    }
    const opcionesDefault = {
      path: coordenadas,
      geodesic: true,
      strokeColor: '#FF0000',
      strokeOpacity: 1.0,
      strokeWeight: 3,
      map: mapa.value,
      ...opciones,
    };
    // ✅ Mark as non-reactive
    const polilinea = markRaw(new googleMaps.value.Polyline(opcionesDefault));
    polilineas.value.set(id, polilinea);
    triggerRef(polilineas);
    return polilinea;
  };
  const actualizarPolilinea = (id, coordenadas) => {
    const polilinea = polilineas.value.get(id);
    if (polilinea) {
      polilinea.setPath(coordenadas);
      return true;
    }
    return false;
  };
  const obtenerPolilinea = id => {
    return polilineas.value.get(id);
  };
  const eliminarPolilinea = id => {
    const polilinea = polilineas.value.get(id);
    if (!polilinea) {
      return false;
    }
    const elementListeners = listeners.value.get(id);
    if (elementListeners) {
      elementListeners.forEach(listener => {
        googleMaps.value.event.removeListener(listener);
      });
      listeners.value.delete(id);
      triggerRef(listeners);
    }
    polilinea.setMap(null);
    polilineas.value.delete(id);
    triggerRef(polilineas);
    return true;
  };
  const eliminarTodasPolilineas = () => {
    polilineas.value.forEach((polilinea, id) => {
      const elementListeners = listeners.value.get(id);
      if (elementListeners) {
        elementListeners.forEach(listener => {
          googleMaps.value.event.removeListener(listener);
        });
        listeners.value.delete(id);
      }
      polilinea.setMap(null);
    });
    polilineas.value.clear();
    listeners.value.clear();
    triggerRef(polilineas);
    triggerRef(listeners);
  };
  // ==================== CIRCLES ====================
  const crearCirculo = (id, opciones = {}) => {
    if (!mapa.value || !googleMaps.value) {
      console.error('The map is not initialized');
      return null;
    }
    const opcionesDefault = {
      center: { lat: -33.4489, lng: -70.6693 },
      radius: 1000,
      strokeColor: '#FF0000',
      strokeOpacity: 0.8,
      strokeWeight: 2,
      fillColor: '#FF0000',
      fillOpacity: 0.35,
      map: mapa.value,
      editable: false,
      draggable: false,
      ...opciones,
    };
    // ✅ Mark as non-reactive
    const circulo = markRaw(new googleMaps.value.Circle(opcionesDefault));
    circulos.value.set(id, circulo);
    triggerRef(circulos);
    return circulo;
  };
  const obtenerCirculo = id => {
    return circulos.value.get(id);
  };
  const eliminarCirculo = id => {
    const circulo = circulos.value.get(id);
    if (!circulo) {
      return false;
    }
    const elementListeners = listeners.value.get(id);
    if (elementListeners) {
      elementListeners.forEach(listener => {
        googleMaps.value.event.removeListener(listener);
      });
      listeners.value.delete(id);
      triggerRef(listeners);
    }
    circulo.setMap(null);
    circulos.value.delete(id);
    triggerRef(circulos);
    return true;
  };
  const eliminarTodosCirculos = () => {
    circulos.value.forEach((circulo, id) => {
      const elementListeners = listeners.value.get(id);
      if (elementListeners) {
        elementListeners.forEach(listener => {
          googleMaps.value.event.removeListener(listener);
        });
        listeners.value.delete(id);
      }
      circulo.setMap(null);
    });
    circulos.value.clear();
    listeners.value.clear();
    triggerRef(circulos);
    triggerRef(listeners);
  };
  // ==================== POLYGONS ====================
  const crearPoligono = (id, coordenadas, opciones = {}) => {
    if (!mapa.value || !googleMaps.value) {
      console.error('The map is not initialized');
      return null;
    }
    const opcionesDefault = {
      paths: coordenadas,
      strokeColor: '#FF0000',
      strokeOpacity: 0.8,
      strokeWeight: 2,
      fillColor: '#FF0000',
      fillOpacity: 0.35,
      map: mapa.value,
      editable: false,
      draggable: false,
      ...opciones,
    };
    // ✅ Mark as non-reactive
    const poligono = markRaw(new googleMaps.value.Polygon(opcionesDefault));
    poligonos.value.set(id, poligono);
    triggerRef(poligonos);
    return poligono;
  };
  const obtenerPoligono = id => {
    return poligonos.value.get(id);
  };
  const eliminarPoligono = id => {
    const poligono = poligonos.value.get(id);
    if (!poligono) {
      return false;
    }
    const elementListeners = listeners.value.get(id);
    if (elementListeners) {
      elementListeners.forEach(listener => {
        googleMaps.value.event.removeListener(listener);
      });
      listeners.value.delete(id);
      triggerRef(listeners);
    }
    poligono.setMap(null);
    poligonos.value.delete(id);
    triggerRef(poligonos);
    return true;
  };
  const eliminarTodosPoligonos = () => {
    poligonos.value.forEach((poligono, id) => {
      const elementListeners = listeners.value.get(id);
      if (elementListeners) {
        elementListeners.forEach(listener => {
          googleMaps.value.event.removeListener(listener);
        });
        listeners.value.delete(id);
      }
      poligono.setMap(null);
    });
    poligonos.value.clear();
    listeners.value.clear();
    triggerRef(poligonos);
    triggerRef(listeners);
  };
  // ==================== INFO WINDOWS ====================
  const crearInfoWindow = (id, opciones = {}) => {
    if (!googleMaps.value) {
      console.error('Google Maps is not loaded');
      return null;
    }
    const opcionesDefault = {
      content: '',
      position: null,
      maxWidth: 300,
      ...opciones,
    };
    // ✅ Mark as non-reactive
    const infoWindow = markRaw(
      new googleMaps.value.InfoWindow(opcionesDefault)
    );
    infoWindows.value.set(id, infoWindow);
    triggerRef(infoWindows);
    return infoWindow;
  };
  const abrirInfoWindow = (infoWindowId, marcadorId) => {
    const infoWindow = infoWindows.value.get(infoWindowId);
    const marcador = marcadores.value.get(marcadorId);
    if (infoWindow && marcador && mapa.value) {
      infoWindow.open({
        anchor: marcador,
        map: mapa.value,
      });
      return true;
    }
    return false;
  };
  const cerrarInfoWindow = id => {
    const infoWindow = infoWindows.value.get(id);
    if (infoWindow) {
      infoWindow.close();
      return true;
    }
    return false;
  };
  const eliminarInfoWindow = id => {
    const infoWindow = infoWindows.value.get(id);
    if (!infoWindow) {
      return false;
    }
    infoWindow.close();
    infoWindows.value.delete(id);
    triggerRef(infoWindows);
    return true;
  };
  const eliminarTodosInfoWindows = () => {
    infoWindows.value.forEach(infoWindow => {
      infoWindow.close();
    });
    infoWindows.value.clear();
    triggerRef(infoWindows);
  };
  // ==================== UTILITIES ====================
  const centrarMapa = (lat, lng, zoom = null) => {
    if (mapa.value) {
      mapa.value.setCenter({ lat, lng });
      if (zoom !== null) {
        mapa.value.setZoom(zoom);
      }
    }
  };
  const ajustarALimites = coordenadas => {
    if (!mapa.value || !googleMaps.value || coordenadas.length === 0) {
      return;
    }
    const bounds = new googleMaps.value.LatLngBounds();
    coordenadas.forEach(coord => {
      bounds.extend(coord);
    });
    mapa.value.fitBounds(bounds);
  };
  const cambiarTipoMapa = tipo => {
    if (mapa.value && googleMaps.value) {
      const tipos = {
        roadmap: googleMaps.value.MapTypeId.ROADMAP,
        satellite: googleMaps.value.MapTypeId.SATELLITE,
        hybrid: googleMaps.value.MapTypeId.HYBRID,
        terrain: googleMaps.value.MapTypeId.TERRAIN,
      };
      mapa.value.setMapTypeId(tipos[tipo] || tipos.roadmap);
    }
  };
  const obtenerCentro = () => {
    if (mapa.value) {
      const center = mapa.value.getCenter();
      return {
        lat: center.lat(),
        lng: center.lng(),
      };
    }
    return null;
  };
  const obtenerZoom = () => {
    return mapa.value ? mapa.value.getZoom() : null;
  };
  const agregarListener = (tipo, callback) => {
    if (mapa.value && googleMaps.value) {
      return googleMaps.value.event.addListener(mapa.value, tipo, callback);
    }
    return null;
  };
  const agregarListenerMarcador = (marcadorId, tipo, callback) => {
    const marcador = marcadores.value.get(marcadorId);
    if (marcador && googleMaps.value) {
      const listener = googleMaps.value.event.addListener(
        marcador,
        tipo,
        callback
      );
      if (!listeners.value.has(marcadorId)) {
        listeners.value.set(marcadorId, []);
      }
      listeners.value.get(marcadorId).push(listener);
      return listener;
    }
    return null;
  };
  const calcularDistancia = (origen, destino) => {
    if (!googleMaps.value || !googleMaps.value.geometry) {
      console.error('The geometry library is not loaded');
      return null;
    }
    const puntoOrigen = new googleMaps.value.LatLng(origen.lat, origen.lng);
    const puntoDestino = new googleMaps.value.LatLng(destino.lat, destino.lng);
    return googleMaps.value.geometry.spherical.computeDistanceBetween(
      puntoOrigen,
      puntoDestino
    );
  };
  const limpiarMapa = () => {
    eliminarTodosMarcadores();
    eliminarTodasPolilineas();
    eliminarTodosCirculos();
    eliminarTodosPoligonos();
    eliminarTodosInfoWindows();
    // Clean up remaining listeners
    listeners.value.forEach(listener => {
      if (Array.isArray(listener)) {
        listener.forEach(l => {
          if (googleMaps.value && googleMaps.value.event) {
            googleMaps.value.event.removeListener(l);
          }
        });
      }
    });
    listeners.value.clear();
    triggerRef(listeners);
  };
  const destruirMapa = () => {
    limpiarMapa();
    mapa.value = null;
  };
  onUnmounted(() => {
    destruirMapa();
  });
  return {
    mapa,
    googleMaps,
    isLoaded,
    isLoading,
    inicializarMapa,
    crearMarcador,
    crearMarcadorAvanzado,
    obtenerMarcador,
    eliminarMarcador,
    eliminarTodosMarcadores,
    animarMarcador,
    crearPolilinea,
    actualizarPolilinea,
    obtenerPolilinea,
    eliminarPolilinea,
    eliminarTodasPolilineas,
    crearCirculo,
    obtenerCirculo,
    eliminarCirculo,
    eliminarTodosCirculos,
    crearPoligono,
    obtenerPoligono,
    eliminarPoligono,
    eliminarTodosPoligonos,
    crearInfoWindow,
    abrirInfoWindow,
    cerrarInfoWindow,
    eliminarInfoWindow,
    eliminarTodosInfoWindows,
    centrarMapa,
    ajustarALimites,
    cambiarTipoMapa,
    obtenerCentro,
    obtenerZoom,
    agregarListener,
    agregarListenerMarcador,
    calcularDistancia,
    limpiarMapa,
    destruirMapa,
    marcadores,
    polilineas,
    circulos,
    poligonos,
    infoWindows,
  };
}
Rather than figuring out absolute paths of the specific MSVC version installed, property macros can be used for this. This can be used in a build event to copy the necessary DLLs required for the address sanitizer automatically. I've come up with the following command for this:
echo "Copying ASan DLLs"
xcopy /Y /D "$(ExecutablePath.Split(';')[0])\clang_rt.asan_*.dll" "$(OutDir)"
xcopy /Y /D "$(ExecutablePath.Split(';')[0])\clang_rt.asan_*.pdb" "$(OutDir)"
As pointed out by @Brandlingo in a comment, the macro $(VCToolsInstallDir) expands to "C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\<version>". I've further found the $(ExecutablePath) macro that expands to a list of executable directories where the first directory is "C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\<version>\bin\Host<arch>\<arch>". Exactly where the correct ASan DLLs for the specific build configuration are. (Saving the hassle to add the target architecture to the path manually)
Because the $(ExecutablePath) macro contains multiple executable directories, the bin\Host<arch>\<arch> one has to be extracted from it. The list is semicolon-separated, and luckily basic .NET string operations are supported on these macros, so a .Split(';')[0] gets just that first directory. (For me this is always the "...MSVC\<version>\bin\Host<arch>\<arch>" one.)
If the order of executable paths in $(ExecutablePath) ever changes, this breaks. If anyone can find a macro that directly expands to "C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\<version>\bin\Host<arch>\<arch>" containing only this path, please let us know. I've only found paths targeted at specific build configurations in $(VC_ExecutablePath_x86), $(VC_ExecutablePath_x64), $(VC_ExecutablePath_x86_ARM), ... no general one.
Thanks to @KamilCuk I ended up making this Dockerfile...
ARG OS_VERSION=12 MODE="accelerated"
FROM debian:${OS_VERSION}-slim AS build
ENV DEBIAN_FRONTEND=noninteractive
ARG MODE
COPY updater.py .
RUN apt-get update && \
    apt-get install --no-install-recommends -y \
      ccache \
      gcc \
      make \
      patchelf \
      pipx \
      python3-dev && \
    apt-get clean && rm -rf /var/lib/apt/lists/* && \
    pipx run nuitka \
      --mode=${MODE} \
      --deployment \
      --assume-yes-for-downloads \
      --python-flag=-OO \
      --output-filename=updater-linux-amd64.bin \
      updater.py
FROM gcr.io/distroless/python3-debian${OS_VERSION}:latest
COPY --from=build updater-linux-amd64.bin /opt/
ENTRYPOINT ["/opt/updater-linux-amd64.bin"]
You can do that with Shortcuts: you must generate a shortcut for each alarm action (set, delete, etc.). iOS does not expose any public API for this feature directly, so you have no option other than Shortcuts.
I've now resolved this. User comments are correct in that the quoted warning was not what was causing the code to fail; it was actually this, further down in the output:
AttributeError: 'Engine' object has no attribute 'connection'
This seems to have been caused by an upgrade in the version of Pandas I was using; apparently the correct syntax for connections has been changed in Pandas 2.2 and up. For more details see here: Pandas to_sql to sqlite returns 'Engine' object has no attribute 'cursor'
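The pattern that works on Pandas 2.x is to hand to_sql/read_sql a live connection (a DBAPI connection or a SQLAlchemy Connection) rather than relying on the older Engine behavior; a minimal self-contained sketch with sqlite3:

```python
import sqlite3
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3]})

# Passing an open connection avoids the Engine attribute mismatch entirely.
with sqlite3.connect(":memory:") as conn:
    df.to_sql("t", conn, index=False)
    out = pd.read_sql("SELECT * FROM t", conn)

print(out["a"].tolist())  # [1, 2, 3]
```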
You can also triple-click on a line to select the whole line. (at least since Xcode 16)
The Cassandra connection was not closed, which led to the warnings on Tomcat shutdown.
If you need to remove a method from Object in an active IRB session (for instance y, which is added by psych in IRB).
self.class.undef_method :y
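A self-contained sketch of the same idea outside IRB (here I define a stand-in y on Object first, since psych's helper may not be loaded):

```ruby
# Define a stand-in for the `y` helper that psych mixes into Object in IRB.
class Object
  def y(obj)
    obj.inspect
  end
end

# In IRB, `self.class` is Object, so this is what `self.class.undef_method :y` does:
Object.undef_method(:y)

begin
  42.y(1)
rescue NoMethodError
  puts "y is gone"
end
```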
Add another field that allows only one value. Set this value as the default and make sure it is unique. Also, change the widget type to 'radio'. This will prevent users from saving more than one piece of content of a given type.
The models you were testing (gemini-1.0-pro, gemini-1.5-pro-latest, gemini-1.5-flash-latest) are retired, meaning Google no longer hosts or serves them. You should migrate to the current active models such as Gemini 2.0, Gemini 2.5 and later. Please see this migration guide for details on which models are retired.
Follow this Link, it helps a lot
Remove from settings.gradle:
apply from: file("../node_modules/@react-native-community/cli-platform-android/native_modules.gradle");
applyNativeModulesSettingsGradle(settings)
This is a legacy autolinking hook. In React Native 0.71+ it is obsolete, and worse, it often breaks Gradle sync.
The connection issue occurred due to how the connection string was interpreted by Python 3.10.0.
CONNECTION_STRING: Final[str] = f"DRIVER={{ODBC Driver 18 for SQL Server}};SERVER=tcp:{server_url},{port};DATABASE={database_name};Encrypt=yes;TrustServerCertificate=yes;"
CONNECTION_STRING: Final[str] = (
    f"DRIVER={{ODBC Driver 18 for SQL Server}};"
    f"SERVER={host};"
    f"DATABASE={database};"
    f"Encrypt=yes;"
    f"TrustServerCertificate=no;"
    f"Connection Timeout={timeout};"
)
⚠️ Note: Ignore the changes in parameter names (server_url vs. host, port, etc.); the key issue lies in how the connection string is constructed, not in the variable names.
You need to create a Synth object. The soundfont can be specified when creating said object.
from midi2audio import FluidSynth
fs = FluidSynth("soundfont.sf2")
fs.midi_to_audio('input.mid', 'test.wav')
Make sure your soundfont and input midi files are in the same directory.
Possible solutions
Configure maxIdleTime on the ConnectionProvider
ConnectionProvider connectionProvider = ConnectionProvider.builder("custom")
    .maxIdleTime(Duration.ofSeconds(60))
    .build();
HttpClient httpClient = HttpClient.create(connectionProvider);
WebClient webClient = WebClient.builder()
    .clientConnector(new ReactorClientHttpConnector(httpClient))
    .build();
Set Timeouts on the HttpClient
HttpClient httpClient = HttpClient.create()
    .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 10000)
    .responseTimeout(Duration.ofSeconds(60));
Disable TCP Keep-Alive
HttpClient httpClient = HttpClient.create()
    .option(ChannelOption.SO_KEEPALIVE, false);
You might also get more useful logs by raising the log level for Netty:
logging:
  level:
    reactor.netty.http.client: DEBUG
Finally found the issue, though I don't know the root cause. VS Code injects NODE_ENV=production into the integrated terminal, so devDependencies are not installed. If anybody else has this issue, the ways to solve it are: override the variable to development in the integrated terminal, use a terminal outside VS Code, or find where VS Code is configured to inject it. I am still searching for that setting myself.
What you are describing is a linting issue, and ESLint is the most common way to handle this for TypeScript today.
There's a plugin that does what you want via the lint rule i18n-json/identical-keys: https://github.com/godaddy/eslint-plugin-i18n-json
You need to add a CSS module declaration so that TypeScript understands CSS imports. Create a type declaration file in your project root, e.g. globals.d.ts, and add declare module "*.css";.
globals.d.ts
declare module "*.css";
If this does not work, I suggest verifying that the TypeScript version in your VS Code is the same as the TypeScript version in your project. Open the command palette in VS Code, type "TypeScript: Select TypeScript Version", and select "Use Workspace Version". That is the TypeScript version listed in your package.json.
This is due to a bug where Apache somehow interferes with the CLI code-server command:
EXIM4 log analyser: a simple yet powerful script, found here. It provides:
Top 10 Failure/Rejection Reasons
Top 10 Senders
Top 10 Rejected Recipients
Date Filter
Since Airflow 3.0.0, the CLI command is:
airflow variables export <destination-filename.json>
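The export is typically a flat JSON object of variable names to values, so it can be post-processed with standard tooling before feeding it back in with airflow variables import. A small sketch; the file name and variable names below are made up for illustration:

```python
import json

# Pretend this is what `airflow variables export vars.json` produced
# (variable names are illustrative, not from a real deployment).
exported = {"environment": "prod", "max_retries": "3", "tmp_debug_flag": "1"}

# Drop throwaway variables before re-importing with `airflow variables import`.
kept = {k: v for k, v in exported.items() if not k.startswith("tmp_")}

with open("vars.json", "w") as f:
    json.dump(kept, f, indent=2)

print(sorted(kept))  # ['environment', 'max_retries']
```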
Instead of going for the PipeTransform, which wasn't working properly for me, I ended up removing the @Type(() => YourClass) from the property and adding a @Transform:
import { plainToInstance, Transform } from 'class-transformer';
import { IsObject, ValidateNested } from 'class-validator';
class YourClass {
  @IsObject()
  @Transform(({ value }) =>
    plainToInstance<NestedClass, unknown>(
      NestedClass,
      typeof value === 'string' ? JSON.parse(value) : value,
    ),
  )
  @ValidateNested()
  property: NestedClass;
}
Thanks, Snuffy! That really did help!
When you create a taxonomy field in ACF and set the "Return Value" to "Term ID", ACF doesn't store the ID as an integer but as a serialized array, even if you allow only one term, like this: a:1:{i:0;s:2:"20";}
So you have to compare the value of the field against the string value of the serialized array. The fixed query in my case looks like this:
$query = new WP_Query( [
    'post_type' => 'photo',
    'meta_query' => [
    [
        'key' => 'year_start',
        'value' => 'a:1:{i:0;s:2:"20";}',
        'compare' => '='
    ]
    ]
]);
I faced the same issue and am still not able to fix it.
Single-column filtering, equivalent to IS NOT NULL or IS NULL in SQL, in kdb+/q:
t:flip `a`b`c`d`e!flip {5?(x;0N)} each til 10
select from t where e <> 0N
a b c d e
---------
3 3   3 3
    5 5 5
7 7   7 7
      8 8
9     9 9
select from t where e = 0N
a b c d e
---------
0 0 0 0
  1
    2
4   4
      6
For Android: you should not use Preferences DataStore, as it stores data as plain text with no encryption, so it can easily be read by other users and apps. Use EncryptedSharedPreferences with strong master keys instead.
For iOS: use Keychain Services.
For both Android and iOS, use libraries such as KotlinCrypto or kotlinx-serialization together with a proper encryption implementation.
What we did was create an app dedicated to subscribing, with only one worker (we call this our Ingestion System API). It then passes the data to our Process API, which runs multiple workers for parallel processing. Hope this helps.
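A minimal, hypothetical sketch of that split in Python; all names are made up, and a local queue stands in for the hand-off between the single-worker ingester and the multi-worker processor:

```python
import queue
from concurrent.futures import ThreadPoolExecutor

# The hand-off channel between the two "APIs" (stands in for a topic/queue).
handoff: queue.Queue = queue.Queue()

def ingest(messages):
    """Single-worker 'Ingestion System API': the only subscriber, just enqueues."""
    for msg in messages:
        handoff.put(msg)

def process(msg):
    """'Process API' work unit, executed by many workers in parallel."""
    return msg * 2

ingest(range(5))

# The Process API side drains the hand-off with multiple workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(process, handoff.get()) for _ in range(handoff.qsize())]
    results = sorted(f.result() for f in futures)

print(results)  # [0, 2, 4, 6, 8]
```

The point of the design is that the subscription stays strictly single-consumer while the expensive work still fans out.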
This is based on @Karoly Horvath's answer; I tried to implement it in Python.
# Longest substring without repeating characters (sliding window)
st = 'abcadbcbbe'
left = 0
max_len = 0
start = 0
seen = set()
for right in range(len(st)):
    # Shrink the window from the left until st[right] is no longer a duplicate
    while st[right] in seen:
        seen.remove(st[left])
        left += 1
    seen.add(st[right])
    if (right - left) + 1 > max_len:
        max_len = (right - left) + 1
        start = left
print(st[start:start + max_len])  # prints 'bcad'
Experimented with Kysely's Generated utility type mentioned by @zegarek in the OP comments. It looks like it is possible to make Kysely work with temporal tables, or more generally with tables that have autogenerated values.
The type for each table with autogenerated values must be altered; you cannot directly use the type inferred by zod. Instead the type needs to be modified. In my case all temporal tables follow the same pattern, so I created the following wrapper:
type TemporalTable<
  T extends {
    versionId: number
    validFrom: Date
    validTo: Date | null
    isCurrent: boolean
  },
> = Omit<T, 'versionId' | 'validFrom' | 'validTo' | 'isCurrent'> & {
  versionId: Generated<number>
  validFrom: Generated<Date>
  validTo: Generated<Date | null>
  isCurrent: Generated<boolean>
}
Now the type for each table is wrapped with this
const TemporalTableSchema = z.object({
  versionId: z.number(),
  someId: z.string(),
  someData: z.string(),
  validFrom: z.coerce.date(),
  validTo: z.coerce.date().optional().nullable(),
  isCurrent: z.boolean()
})
type TemporalTableSchema = TemporalTable<z.infer<typeof TemporalTableSchema>>
Now when defining the database type to give to Kysely I need to write it manually
const MyDatabase = z.object({
  table1: Table1Schema,
  temporalTable: TemporalTableSchema
})
type MyDatabase = {
  table1: z.infer<typeof Table1Schema>,
  temporalTable: TemporalTableSchema,
  // alternatively you can wrap the type of the table into the temporal type wrapper here
  anotherTemporalTable: TemporalTable<z.infer<typeof AnotherTemporalTable>>
}
So basically you need to write the database type by hand and wrap the necessary table types with the wrapper. You can't simply compose the zod object for the database, infer its type, and use that as the type for your database.
As of Xcode 26.0.1 (2025), there is no Keychain Sharing option in Capabilities. Does anyone know why?
According to the documentation (https://docs.spring.io/spring-cloud-gateway/reference/appendix.html), try using 'trusted-proxies'.
OK, I found the problem: it is about how the Popover API positions the element, by default in the center of the viewport, using margins and insets.
I've solved it by resetting the second popover:
#first-popover {
  width: 300px;
}
#second-popover:popover-open {
  margin: 0;
  inset: auto;
}
#second-popover {
  position-area: top;
}
<button id="open-1" popovertarget="first-popover">Open first popover</button>
<div id="first-popover" popover>
   <button id="open-2" popovertarget="second-popover">Open second popover</button>
</div>
<div id="second-popover" popover="manual">Hello world</div>