May 2025 Update
I have two .NET Framework 4.8 WinForms projects, one two years old and one a week old.
Microsoft-WindowsAPICodePack-Core/Shell v1.1.5 by rpastric, contre, dahall was used on the older project.
Code copied from the old project to the new one hit the runtime error that led me here.
Installing Microsoft-Windows10-APICodePack-Core/Shell v1.1.8 by rpastric, contre, dahall, **bau-global** fixed the problem.
Microsoft-WindowsAPICodePack-Core/Shell v1.1.5 MUST be installed before Microsoft-Windows10-APICodePack-Core/Shell v1.1.8, and all four packages appear to be necessary.
I encountered the same issue. When reading an RTMP stream, a brief network interruption caused the stream to hang, and it took about 30 seconds before any frames were returned again.
To get a JSON file reporting the current IPs and URLs used by Microsoft, grouped by service, you can use the following URL:
https://endpoints.office.com/endpoints/worldwide?clientrequestid=[GUID]
Just substitute a random GUID at the end of it.
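For example, a minimal Python sketch that builds such a request URL (the helper function name is mine; the endpoint and the `clientrequestid` GUID parameter are Microsoft's documented web service):

```python
import uuid

def endpoints_url(instance: str = "worldwide") -> str:
    # clientrequestid must be a fresh GUID for each request
    return (f"https://endpoints.office.com/endpoints/{instance}"
            f"?clientrequestid={uuid.uuid4()}")

print(endpoints_url())
```

You can then fetch that URL with any HTTP client to get the JSON grouped by service area.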
How did you fix the problem? I'm using the Drain code from logpai for parsing logs for anomaly detection, and I'm stuck on the same problem.
raw input :
081109 203615 148 INFO dfs.DataNode$PacketResponder: PacketResponder 1 for block blk_38865049064139660 terminating
example of expected output :
LineId | Date | Time | Pid | Level | Component | Content | EventId | EventTemplate | ParameterList |
---|---|---|---|---|---|---|---|---|---|
1 | 081109 | 203615 | 148 | INFO | dfs.DataNode$PacketResponder | PacketResponder 1 for block blk_38865049064139660 terminating | dc2c74b7 | PacketResponder <> for block <> terminating | ['1', 'blk_38865049064139660'] |
the error:
splitter = re.sub(" +", "\\\s+", splitters[k])
Processed 100.0% of log lines.
---------------------------------------------------------------------------
error Traceback (most recent call last)
<ipython-input-20-29577b162b2c> in <cell line: 0>()
21 log_format, indir=input_dir, outdir=output_dir, depth=depth, st=st, rex=regex
22 )
---> 23 parser.parse(log_file_all)
24
25 ## run on complete dataset
10 frames
/usr/lib/python3.11/re/_parser.py in parse_template(source, state)
1085 except KeyError:
1086 if c in ASCIILETTERS:
-> 1087 raise s.error('bad escape %s' % this, len(this)) from None
1088 lappend(this)
1089 else:
error: bad escape \s at position 0
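For anyone else stuck here: since Python 3.7, an unknown escape such as `\s` in the *replacement* string of `re.sub` is an error, which is exactly what the logparser line above trips over. A minimal sketch of the fix (the sample string is mine, not from logparser):

```python
import re

splitter_src = "Date Time Level"

# Broken on Python >= 3.7: "\s" is an unknown escape in the replacement template
# re.sub(" +", "\s+", splitter_src)   # raises re.error: bad escape \s

# Fix: escape the backslash so the template inserts a literal \s+
splitter = re.sub(" +", r"\\s+", splitter_src)
print(splitter)  # → Date\s+Time\s+Level
```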
If you're trying to upload a file with curlpp, the `PostFields` approach does not work. Instead you could use `curlpp::FormParts`, which has the file attachment option. So instead of,
std::string POSTrequest = "[email protected]";
request.setOpt(new curlpp::options::PostFields(POSTrequest));
request.setOpt(new curlpp::options::PostFieldSize(POSTrequest.length()));
request.setOpt(new curlpp::options::WriteStream(&std::cout));
You would write the following,
std::string filename = "test.txt";
curlpp::Forms formParts;
formParts.push_back(new curlpp::FormParts::File("file", filename));
request.setOpt(new curlpp::options::HttpPost(formParts));
For more information you should check out this StackOverflow post. This approach is based on that post.
Furthermore, you could also look at the official curlpp
forms example, although it doesn't demonstrate file uploading specifically.
`rememberTextFieldState()` is a helper function designed for use with the new state-based `BasicTextField` (formerly `BasicTextField2`) in Compose Foundation. It's useful for managing simple text input state directly in the Composable without needing external state management. It uses `rememberSaveable` under the hood, so it automatically survives configuration changes.
On the other hand, `MutableStateFlow` is part of the Kotlin Flow API, and it's typically used in a `ViewModel` to hold and manage UI state in a lifecycle-aware, testable way. It's more suitable when your app needs to follow unidirectional data flow (UDF) or MVVM architecture.
**Use `rememberTextFieldState()` when:**
- You're managing state locally within a single Composable.
- You want a quick and simple setup.
**Use `MutableStateFlow` when:**
- You need to share state across multiple Composables.
- You want to persist state in a ViewModel.
- You care about separation of concerns and testability.
In practice, you could combine both: use `StateFlow` in your ViewModel and bind it to the Composable’s UI state via a state hoisting pattern.
After some debugging, I seem to have figured out the problem. The issue in the above code lies with the state parameter in reddit.auth.url(). I tried different strings other than "..." and avoided using any special characters, and the request worked fine.
My current code:
_auth_url = reddit.auth.url(
scopes=["identity"],
state="authAccess",
duration="permanent"
)
It probably has something to do with how PRAW validates the parameters behind the scenes.
Thanks!
I ran into the same error. In my case, the Chromium driver wasn't installed, and installing it with the following command fixed the issue:
sudo apt install chromium-chromedriver
The GPT-2 tokenizer is not compatible with the newer GPT-3 models (including davinci-002 and gpt-3.5-turbo). You'd have to use something like tiktoken instead.
For example:
import tiktoken
tokenizer = tiktoken.encoding_for_model('gpt-3.5-turbo')
tokenizer.encode('airplane')
This might be unrelated, but what I found useful was to disable health checks in the Auto Scaling group. This allows the script to keep running even if it fails, so you can get more details on why it failed. For me it is sometimes a file path not set up correctly in the app.
Hope it helps.
I tested this regex in the PHP flavour. It will not match $500, since you want more than $500. It will match either exactly two decimal places or a whole amount. It also handles the case where 600, 700, etc. were ignored because we were excluding 500.
\$([5-9]+\d[1-9]|[6-9]+\d{2}|[1-9]+\d{3,})(\.\d{2})?
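A quick way to sanity-check the pattern (this uses Python's re, which treats this particular pattern the same as PHP's PCRE; the sample amounts are mine):

```python
import re

pattern = r"\$([5-9]+\d[1-9]|[6-9]+\d{2}|[1-9]+\d{3,})(\.\d{2})?"

for amount in ["$500", "$501", "$600", "$600.25", "$499", "$1250.00"]:
    verdict = "matches" if re.fullmatch(pattern, amount) else "no match"
    print(amount, verdict)
# $500 and $499 do not match; the others do
```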
Update: I double-checked; my old data from before the outage is not affected, but the new records written after the server came back up are wrong.
Here's how to fix the Command 'adk' not found error:
1. Ensure the ADK is installed
If you installed it in a virtual environment, activate the environment and then reinstall:
pip install -U adk
To install it outside a virtual environment (a per-user install, not recommended for dev work):
pip install --user -U adk
---
2. Check that the adk command is in your PATH
Even if adk is installed, your shell may not know where it is. Try running:
python3 -m adk --help
If that works, it confirms that ADK is installed, but the CLI script isn’t linked in your PATH.
To find where it is:
pip show adk
Look at the "Location" field, then look for the bin folder inside it — that’s where adk should be. For example:
Location: /home/yourname/.local/lib/python3.10/site-packages
Then check:
ls /home/yourname/.local/bin
If adk is in there, add it to your PATH:
export PATH=$PATH:/home/yourname/.local/bin
To make this change permanent, add that line to your ~/.bashrc or ~/.zshrc.
---
3. Try again
After setting the PATH:
adk --help
adk web
---
In Android Studio go to:
Settings -> Experimental -> (check) Configure all Gradle tasks during Gradle Sync
Then do a Gradle sync, and all the folders should be created in the Gradle tab on the right.
File > Preferences > Settings
by changing the value to false.
Most likely, the parameter value on the calling side was not converted to a string. Prompts are a bit different from tools: the arguments passed in must be strings.
This problem once cost me a whole day.
You are not allowed to pass a list data structure in your route object. If you have a list of objects, you can define it as a String: convert the list to JSON and pass that, then on the other screen convert the JSON back into the list of objects you want, and you're good to go.
User error. I tried to pass the nonexistent float4 in a texture declaration in Metal. It should just be float. Odd that I got the Toolchain missing error from that.
Server IP Reputation: Cloud/VPS providers (AWS, Azure, etc.) often have IPs flagged as suspicious or part of known data centers.
Headless mode: Even with undetected_chromedriver, running in headless mode can increase detection likelihood.
Identifiable browser fingerprints: Subtle mismatches in JS APIs, fonts, canvas fingerprinting, or automation indicators.
Repeated queries: Frequent or patterned queries look automated.
Use residential proxies instead of datacenter IPs to mimic real users.
Rotate proxies to avoid rate-limiting.
self.chrome_options.add_argument('--proxy-server=http://user:pass@proxyhost:port')
Headless detection has improved significantly.
parser = YandexParser(USE_GUI=True)
Add mouse movements and scrolling before parsing or clicking.
Use libraries like pyautogui, or simulate manually via Selenium:
from selenium.webdriver.common.action_chains import ActionChains

ActionChains(driver).move_by_offset(100, 200).perform()
driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
Add random wait times between actions.
Randomize user agents:
from fake_useragent import UserAgent

ua = UserAgent()
self.chrome_options.add_argument(f'user-agent={ua.random}')
Integrate CAPTCHA-solving services. These services solve CAPTCHAs using human labor or AI.
time.sleep(random.uniform(10, 30))
after each session or query group.
Sometimes mobile versions are less protected:
self.driver.get(f"https://m.ya.ru/search/?text={film_name}")
import undetected_chromedriver as uc
import random
from fake_useragent import UserAgent
from selenium.webdriver.common.by import By
from selenium.webdriver.common.action_chains import ActionChains
from datetime import datetime
import time
import logging
import traceback
import pathlib
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s [%(levelname)s] %(message)s",
handlers=[
logging.FileHandler("yandex_parser.log"),
logging.StreamHandler()
]
)
logger = logging.getLogger('ParserLogger')
class YandexParser():
def __init__(self, USE_GUI=True):
self.ua = UserAgent()
self.chrome_options = uc.ChromeOptions()
if not USE_GUI:
self.chrome_options.add_argument('--headless=new') # headless=new is less detectable
self.chrome_options.add_argument('--no-sandbox')
self.chrome_options.add_argument('--disable-dev-shm-usage')
# Stealth settings
self.chrome_options.add_argument('--disable-blink-features=AutomationControlled')
self.chrome_options.add_argument("--disable-infobars")
self.chrome_options.add_argument("--disable-popup-blocking")
self.chrome_options.add_argument("--start-maximized")
self.chrome_options.add_argument(f"user-agent={self.ua.random}")
# Optional: Add proxy here if you have one
# self.chrome_options.add_argument('--proxy-server=http://user:pass@proxy_host:proxy_port')
self.driver = uc.Chrome(options=self.chrome_options)
# Hide webdriver
self.driver.execute_cdp_cmd('Page.addScriptToEvaluateOnNewDocument', {
'source': '''
Object.defineProperty(navigator, 'webdriver', {get: () => undefined});
'''
})
self.driver.get("https://ya.ru/")
self._simulate_human()
time.sleep(random.uniform(2, 4))
def _simulate_human(self):
try:
# Scroll and mouse movement
self.driver.execute_script("window.scrollTo(0, document.body.scrollHeight / 3);")
ActionChains(self.driver).move_by_offset(random.randint(5, 200), random.randint(5, 200)).perform()
time.sleep(random.uniform(0.5, 1.5))
except Exception as e:
logger.warning(f"Human simulation failed: {e}")
def close(self):
self.driver.quit()
def check_captcha(self):
cur_time = str(datetime.now()).replace(' ', '_')
if "showcaptcha" in self.driver.current_url:
logger.warning("Captcha found")
self.driver.save_screenshot(f'screens/img_captcha_{cur_time}.png')
try:
button = self.driver.find_element(By.XPATH, "//input[@class='CheckboxCaptcha-Button']")
button.click()
logger.info("Captcha checkbox clicked")
time.sleep(random.uniform(1.5, 3))
self.driver.save_screenshot(f'screens/img_captcha_afterclick_{cur_time}.png')
except Exception as e:
logger.warning("Captcha click failed.")
else:
self.driver.save_screenshot(f'screens/img_{cur_time}.png')
def parse(self, film_name: str):
logger.info(f"Start parsing: {film_name}")
result_urls = []
try:
self.driver.get(f"https://ya.ru/search/?text={film_name}&lr=213&search_source=yaru_desktop_common&search_domain=yaru")
self._simulate_human()
self.check_captcha()
for i in range(1, 5):
result_urls.extend(self.parse_page(i))
self.get_next_page()
self._simulate_human()
self.check_captcha()
time.sleep(random.uniform(2, 5))
except Exception:
logger.error(f"Exception: {traceback.format_exc()}")
finally:
logger.info(f"Found {len(result_urls)} results for '{film_name}': {result_urls}")
def parse_page(self, page_id):
res = []
try:
urls_raw = self.driver.find_elements(By.XPATH, '//a[@class="Link Link_theme_normal OrganicTitle-Link organic__url link"]')
for url_raw in urls_raw:
href = url_raw.get_attribute("href")
if href and "yabs.yandex.ru" not in href:
res.append(href)
logger.info(f"Found {len(res)} URLs on page {page_id}")
except Exception:
logger.warning(f"Could not parse page {page_id}")
return res
def get_next_page(self):
try:
next_btn = self.driver.find_elements(By.XPATH, '//div[@class="Pager-ListItem Pager-ListItem_type_next"]')
if next_btn:
next_btn[0].click()
time.sleep(random.uniform(3, 6))
except Exception as e:
logger.warning(f"Next page navigation failed: {e}")
if __name__ == "__main__":
pathlib.Path('screens/').mkdir(exist_ok=True)
parser = YandexParser(USE_GUI=True) # GUI mode for better stealth
films = ["Терминатор смотреть", "Саша Таня смотреть", "Джон Уик смотреть онлайн"]
idx = 0
try:
while True:
film = films[idx]
idx = (idx + 1) % len(films)
parser.parse(film)
time.sleep(random.uniform(8, 15))
except Exception as e:
logger.error(f"Fatal error: {e}")
finally:
parser.close()
In my case, setting --network=host was the issue; not passing the --network parameter solved it.
ridk enable
in your terminal. pacman and other commands should work.
npm config delete proxy
npm config delete http-proxy
npm config delete https-proxy
Try this. This is working.
@javadev How did you fix it? Can you please share your ingress config for this?
Perhaps you haven't activated venv
# Create
python -m venv .venv
# Activate (each new terminal)
# macOS/Linux: source .venv/bin/activate
# Windows CMD: .venv\Scripts\activate.bat
# Windows PowerShell: .venv\Scripts\Activate.ps1
I came across this project https://github.com/tschuehly/spring-view-component that might be close to what you are looking for. Take a look
After repeated trials and errors, I could build the full path of a struct dentry * in an eBPF/LSM hook.
Suppose a set of directory entry (dentry) objects is constructing the path /home/knight/paff.txt. In this case, the directory entry objects will be connected as below:
paff.txt (each dentry object stores its own name string, e.g. "paff.txt", "knight")
↓ (dentry has "struct dentry * d_parent" indicating the parent)
knight
↓
home ←---*
| | (if the current dentry is root, then current == current->d_parent)
*-------*
So, in the Linux kernel, when it has to construct the whole path string like /home/knight/paff.txt, it calls the dentry_path_raw() function, which is defined in fs/d_path.c. dentry_path_raw() calls __dentry_path(), which is defined as below.
/*
* Write full pathname from the root of the filesystem into the buffer.
*/
static char *__dentry_path(const struct dentry *d, struct prepend_buffer *p)
{
const struct dentry *dentry;
struct prepend_buffer b;
int seq = 0;
rcu_read_lock();
restart:
dentry = d;
b = *p;
read_seqbegin_or_lock(&rename_lock, &seq);
while (!IS_ROOT(dentry)) { // repeat traverse until the current dentry is root
const struct dentry *parent = dentry->d_parent; // store its parent
prefetch(parent);
if (!prepend_name(&b, &dentry->d_name)) // prepend current dentry's name to the current buffer
break;
dentry = parent; // move to its parent
}
if (!(seq & 1))
rcu_read_unlock();
if (need_seqretry(&rename_lock, seq)) {
seq = 1;
goto restart;
}
done_seqretry(&rename_lock, seq);
if (b.len == p->len)
prepend_char(&b, '/');
return extract_string(&b);
}
In short, the given function traverses the dentry chain from the lowest (e.g., paff.txt) to the highest (e.g., /home). Every time it discovers a new dentry, it reads that dentry's name and prepends it to the current buffer, like below.
paff.txt (discovers a new dentry)
/paff.txt (manually append '/')
knight/paff.txt (discovers a new dentry)
/knight/paff.txt (manually append '/')
home/knight/paff.txt (discovers a new dentry)
/home/knight/paff.txt (manually append '/') ----> finished
So the algorithm for constructing the full path is not so difficult; it just needs to move string pointers within a buffer. However, the eBPF verifier, which statically and stringently verifies code safety, doesn't allow many conventional C practices because of potential buffer overflows or other errors, making the implementation itself very difficult. For example, prepending '/' to the front of the current buffer, or moving the buffer text to the front after constructing the full path: everything was a problem, and I couldn't tackle these things, at least within my capability.
Since my eBPF development environment was a combination of C (eBPF/LSM) and Go (Cilium package), I used the following approach.
1. Discover a new dentry and read its name (e.g., paff.txt) using a BPF CO-RE read function.
2. Send the name to user space (to the Go program) immediately, because Go provides easy and safe ways to manipulate strings without the harsh eBPF verifier.
3. When the dentry chain has nothing new, all the dentry names have been sent to user space, like ["paff.txt", "knight", "home"]; then construct the full path there. Done.
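The user-space half of step 3 is then trivial. My actual implementation is in Go, but the idea can be sketched in a few lines of Python:

```python
def build_path(names):
    # names arrive lowest-first from the eBPF program,
    # e.g. ["paff.txt", "knight", "home"]
    return "/" + "/".join(reversed(names))

print(build_path(["paff.txt", "knight", "home"]))  # → /home/knight/paff.txt
```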
This may not be desirable, because it sends data from kernel space to user space repeatedly for a single full-path construction. However, considering that this happens relatively rarely compared to file opens or reads, I think the additional overhead is acceptable. Additionally, it is easy to implement, with no need to wrestle with the eBPF verifier.
I want to share my struggle and progress with anyone who may confront this issue in the future; you can find my full implementation in C and Go at the following links.
Thanks for reading!
I feel incredibly dumb for spending hours on this issue, but this error is caused by having an empty enum file. In my case specifically, PRODUCT_TYPE didn't have any enums.
The second BufferedReader sees the stream as empty because the first BufferedReader read the underlying stream to EOF when filling its buffer. There is nothing left to read.
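Java's BufferedReader isn't unique here; Python's io.BufferedReader buffers the underlying stream the same way, which makes the effect easy to demonstrate (this Python sketch is an analogy, not the asker's Java code):

```python
import io

raw = io.BytesIO(b"hello world")

# Reading even one byte makes the first buffered reader fill its
# internal buffer, draining the underlying stream to EOF.
first = io.BufferedReader(raw)
print(first.read(1))   # → b'h'

# A second buffered reader over the same raw stream finds nothing left.
second = io.BufferedReader(raw)
print(second.read())   # → b''
```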
All of your assumptions are true.
I got this because I didn't include my index.html. The index.html file path should be:
resources/static/index.html
This feature is really important. Hopefully it’ll be supported soon to help make the app better for users
I'm working with Apache POI 5.4.1
protected void init(final HSSFWorkbook workbook) {
if (workbook == null) return;
final HSSFFont font = workbook.getFontAt(0);
font.setFontName("Calibri");
font.setCharSet(Font.DEFAULT_CHARSET);
}
I presume the font at index zero is the default Body style font.
In the Room Persistence Library, to make a primary key auto-increment, we use the annotation @PrimaryKey(autoGenerate = true). This tells Room that the field should automatically increase its value each time a new record is inserted into the database.
Example:
@Entity
public class Student {
    @PrimaryKey(autoGenerate = true)
    public int id;
    public String name;
    public int age;
}
In the example above, the id field is the primary key, and autoGenerate = true ensures that Room will auto-increment it for each new Student entry.
Conclusion:
To make a primary key auto-increment in Room, use @PrimaryKey(autoGenerate = true).
I created an online AutoCAD SHX font viewer. You don't need to install any software; just open the following link.
[https://mlight-lee.github.io/shx-parser/](https://mlight-lee.github.io/shx-parser/)
Moreover, the source code is available here.
[https://github.com/mlight-lee/shx-parser](https://github.com/mlight-lee/shx-parser)
Have you tried?
pip install strip-hints hatch hatchling
Although this entry is a bit dated, a simple yet powerful solution remains available: the Data Source for Contact Form 7 plugin. This plugin allows you to populate form fields dynamically with data from a database or other sources. For instance, if your URL includes a product ID and you wish to display the corresponding product title in a form field, you can utilize a recordset tag within your form. The recordset tag fetches the product information from the database, filtering by the "product" URL parameter, while a recordset-field-link field then populates the desired form field with the retrieved data. The hypothetical CF7 form structure would be:
<label>Post title</label>: [text title]
[cf7-recordset id="product-data" type="database" query="SELECT post_title FROM {wpdb.posts} WHERE ID={var.url.product}"]
[cf7-link-field recordset="product-data" field="title" value="post_title"]
I've had a lot of good experiences using: github.com/AngusJohnson/Clipper2
It's very easy to get your 2D polygon data into it, it has an excellent OSS license, lots of operations besides union and difference, and it's very fast, much faster than GPC, even with large point sets.
I went into the Repositories view and deleted all the previous local repos attached to this project (after confirming that the most recent repo, imported from GitHub, contained all the project's commit history).
Still that wasn't enough.
I went into the macOS file system and renamed all the old project-level folders (in the folder named "git") to block Eclipse from accessing them.
Then I quit Eclipse and restarted, choosing the same Workspace I've been using for years.
THIS SEEMS TO HAVE SOLVED THE PROBLEM.
Apparently, Eclipse was still reaching into one of the older repos?? I'll never know.
Your issue stems from LinkedIn's restricted access to certain API permissions, especially r_liteprofile and w_member_social, which require your app to be approved for specific LinkedIn products, not just implementing the OAuth flow correctly.
Was this ever solved, Baba? I am doing the same type of upgrade and am about to re-write my custom attributes as well.
I also have this issue with jhipster 8. Do you find a workaround for this?
https://ameblo.jp/tenshoku-gokui/entry-12850431845.html
You may want to try the color.xml that can be downloaded from the above site. Change "default background" in the Dreamweaver settings to "#2D2A2E".
Thank you to everyone for the interesting visualizations; it helped me learn a lot. I realized that my original idea of a 'clock face' creates a confusing and misleading visual when you have activities that span midnight on two nights in a row. I switched instead to a semicircular, fan-like shape.
A segment graph also works well for comparing days, so that you can detect patterns in when the baby is most commonly awake or asleep, while the spiral track most elegantly deals with activities crossing midnight.
library(ggplot2)
library(lubridate)
library(tidyverse)
library(googlesheets4) #https://googlesheets4.tidyverse.org/
library(aptheme)
library(ggpubr) # allows us to use ggarrange
library(glue)
#Load in sample data from a public Google Sheet
gs4_deauth()
url = # insert url here
df = read_sheet(url)
# Don't want to actually share this Google Sheet so here's reproducible data for seven days:
df = structure(list(date = structure(c(1756512000, 1756512000, 1756512000,
1756512000, 1756512000, 1756512000, 1756512000, 1756512000, 1756512000,
1756598400, 1756598400, 1756598400, 1756598400, 1756598400, 1756598400,
1756598400, 1756598400, 1756684800, 1756684800, 1756684800, 1756684800,
1756684800, 1756684800, 1756684800, 1756684800, 1756684800, 1756684800,
1756771200, 1756771200, 1756771200, 1756771200, 1756771200, 1756771200,
1756771200, 1756771200, 1756771200, 1756857600, 1756857600, 1756857600,
1756857600, 1756857600, 1756857600, 1756857600, 1756857600, 1756857600,
1756857600, 1756857600, 1756944000, 1756944000, 1756944000, 1756944000,
1756944000, 1756944000, 1756944000, 1756944000, 1756944000, 1757030400,
1757030400, 1757030400, 1757030400, 1757030400, 1757030400, 1757030400,
1757030400, 1757030400), class = c("POSIXct", "POSIXt"), tzone = "UTC"),
activity = c("sleep", "feed", "sleep", "feed", "sleep", "feed",
"sleep", "feed", "sleep", "sleep", "sleep", "feed", "sleep",
"feed", "sleep", "feed", "sleep", "sleep", "feed", "feed",
"sleep", "feed", "sleep", "feed", "sleep", "feed", "sleep",
"sleep", "feed", "sleep", "feed", "sleep", "feed", "sleep",
"feed", "sleep", "sleep", "feed", "sleep", "feed", "sleep",
"feed", "sleep", "feed", "sleep", "feed", "sleep", "feed",
"sleep", "feed", "sleep", "feed", "sleep", "feed", "sleep",
"feed", "sleep", "feed", "sleep", "feed", "sleep", "feed",
"sleep", "feed", "sleep"), start_time = structure(c(-2209158000,
-2209152600, -2209150800, -2209139100, -2209132800, -2209122000,
-2209113600, -2209096800, -2209089600, -2209152600, -2209145400,
-2209139100, -2209132200, -2209122000, -2209114800, -2209100400,
-2209095000, -2209150800, -2209148100, -2209146600, -2209132800,
-2209123800, -2209122000, -2209113000, -2209101360, -2209096800,
-2209093200, -2209158900, -2209153200, -2209150800, -2209139400,
-2209134600, -2209122900, -2209115700, -2209097400, -2209089600,
-2209157400, -2209151400, -2209149000, -2209138800, -2209132800,
-2209120200, -2209114200, -2209096800, -2209091400, -2209083900,
-2209082400, -2209153800, -2209150800, -2209140000, -2209135500,
-2209123800, -2209114800, -2209097700, -2209089600, -2209080600,
-2209157100, -2209150800, -2209149000, -2209138200, -2209132800,
-2209122600, -2209113000, -2209096200, -2209089600), class = c("POSIXct",
"POSIXt"), tzone = "UTC"), end_time = structure(c(-2209152600,
-2209152000, -2209140000, -2209138200, -2209127400, -2209120560,
-2209105200, -2209093200, -2209154400, -2209147200, -2209140000,
-2209138200, -2209131600, -2209120200, -2209107600, -2209096800,
-2209158000, -2209148100, -2209147200, -2209146000, -2209125600,
-2209122900, -2209113900, -2209110900, -2209098360, -2209095900,
-2209079700, -2209153500, -2209152300, -2209140900, -2209138200,
-2209129200, -2209121700, -2209107000, -2209094100, -2209161000,
-2209152000, -2209150800, -2209140000, -2209137900, -2209128000,
-2209119000, -2209105800, -2209093200, -2209084200, -2209083600,
-2209154400, -2209152900, -2209141800, -2209138800, -2209130100,
-2209122600, -2209107600, -2209094400, -2209082400, -2209078800,
-2209151700, -2209150200, -2209139400, -2209137000, -2209128300,
-2209120800, -2209106400, -2209093200, -2209077000), class = c("POSIXct",
"POSIXt"), tzone = "UTC")), row.names = c(NA, -65L), class = c("tbl_df",
"tbl", "data.frame"))
# Adjust so we have the correct days (rather than Google's default of December 30, 1899)
# Midnight flag marks activities that go from one day to the next
df = df %>%
mutate(midnight_flag = if_else(end_time - start_time < 0,
TRUE,
FALSE)) %>%
mutate(start_time = update(start_time, year = year(date), month = month(date), day = day(date)),
end_time = update(end_time, year = year(date), month = month(date), day = day(date))) %>%
mutate(end_time = if_else(midnight_flag,
end_time + days(1),
end_time)) %>%
mutate(activity = str_to_title(activity))
# Split up days that cross midnight into two rows
df = df %>%
split(seq_len(nrow(.))) %>% # Split dataframe by row into a list of tibbles
map_dfr(function(row) { # function to bind rows together into a dataframe again
if (row$midnight_flag) {
midnight = ceiling_date(row$start_time, unit = "day")
just_before_midnight = update(row$date, hour = 23, minute = 59, second = 59)
tibble( # Creates tibble with two rows
activity = row$activity,
date = c(row$date, midnight),
start_time = c(row$start_time, midnight),
end_time = c(just_before_midnight, row$end_time)
)
} else {
row %>% select(-midnight_flag)
}
}) %>% mutate(date = as_date(date))
# Visualize as semicircles
semicircle_graph = function(starting_date) {
midnight_start <- ymd_hms(paste(starting_date, "00:00:00"))
midnight_end <- ymd_hms(paste(starting_date + days(1), "00:00:00"))
weekday = strftime(starting_date, '%A')
df %>%
filter(date == starting_date) %>%
mutate(weekday = strftime(date, '%A')) %>%
ggplot(aes(xmin = start_time, xmax = end_time,
ymin = 0, ymax = 1, fill = activity)) +
geom_rect() +
scale_x_datetime(limits = c(midnight_start, midnight_end),
date_labels = "%l %p",
expand = c(0,0), # Comment this out and the shape will be more like a fan
breaks = seq(midnight_start, midnight_end, by = "4 hours")) +
theme_minimal(base_family = "Chalkboard") +
ggtitle(glue("{weekday}, {starting_date}")) +
theme(axis.text.y = element_blank(),
axis.ticks.y = element_blank(),
plot.title = element_text(hjust = 0.5, face=1),
legend.position = "bottom",
legend.title = element_blank()) +
coord_radial(start = -0.5 * pi, end = 0.5 * pi)
}
# Plot each day and save as a list
plots = map(unique(df$date), semicircle_graph)
ggarrange(plotlist = plots,
common.legend = TRUE,
legend = "bottom",
nrow = 2, ncol = 4)
When you first run airflow standalone, it generates a password for the admin user.
Using WSL on Windows:
username@machine: head /home/username/airflow/simple_auth_manager_passwords.json.generated
The output should show a JSON password like the following:
{"admin": "SomeRandomStringEWNEWNE"}
Thank you @it all makes cents
After following one of your recommended posts, I ended up at https://learn.microsoft.com/en-us/powershell/scripting/dev-cross-plat/choosing-the-right-nuget-package?view=powershell-7.5&viewFallbackFrom=powershell-7.2
Not really understanding the entire article, I realized that System.Management.Automation
and Microsoft.PowerShell.SDK
are two different implementations for more or less the same thing. So I uninstalled Microsoft.PowerShell.SDK
and followed the error messages.
Thus, I had to install:
Afterwards, I again received an error for System.IdentityModel.Tokens.Jwt: The located assembly's manifest definition does not match the assembly reference. (0x80131040)
After some more googling, I realized that this package is installed implicitly into my application, which caused it to be installed as an older version. I also didn't know that Rider displays the most recent version on the right-hand side and the currently installed version in the list on the left-hand side. Installing the package explicitly solved the issue.
Thank you so much!
I stumbled across this. It looks like you need to put in limits for both axes: Alter axes in R with NMDS plot
Sure! If you're looking for a reliable and experienced team in the UAE, I highly recommend Solo Soft Solutions. They specialize in full IT infrastructure services including:
Fiber optic network installation
CCTV and access control systems
Audio visual integration
Wi-Fi and structured cabling
Office phone systems and smart automation
I recently worked with them for a complete setup in our office in Abu Dhabi—they provided everything from consultation to installation and ongoing support. Very professional team, competitive pricing, and everything was done on time.
📍 Based in Industrial City of Abu Dhabi, Mussafah, UAE, but they also serve Dubai, Sharjah, and other cities across the UAE. Worth checking out if you want a one-stop solution for IT and security systems.
import subprocess

# Enter the m3u8 stream URL here
m3u8_url = "https://cdn.live.easybroadcast.io/abr_corp/73_aloula_w1dqfwm/playlist_dvr.m3u8?user=sgls540839&session=ebfcbea02049a578fa06d663e18dbca3819e305f90283c5a9b7845826ac25b45517e7c8f391d493b384e97fef14771cc"

# Enter your stream key from Facebook Live
facebook_stream_key = "FB-122158192754578411-0-Ab2uIqC0qn0nx3xpHfO8fGcW"

# Facebook RTMPS live-stream URL
facebook_rtmp_url = f"rtmps://live-api-s.facebook.com:443/rtmp/{facebook_stream_key}"

# ffmpeg command
ffmpeg_cmd = [
    'ffmpeg',
    '-re',           # pace the input as if it were a live stream
    '-i', m3u8_url,  # video source
    '-c:v', 'copy',  # copy the video without re-encoding
    '-c:a', 'aac',   # make sure the audio is AAC
    '-f', 'flv',     # container format required for RTMP
    facebook_rtmp_url
]

# Run the command
process = subprocess.Popen(ffmpeg_cmd)
process.wait()
Best Answer and it helped me too.
IF(
HASONEVALUE('Table'[Quarter]),
CALCULATE(AVERAGE('Table'[Sales]), ALL('Table'), VALUES('Table'[Quarter])))
I wrote an extension to UIImage that creates a border around an image following the image's shape. You can specify the width, color, and alpha of the original image.
extension UIImage {
    func withOutline(width: CGFloat, color: UIColor, alpha: CGFloat = 1.0) -> UIImage? {
        guard let image = addTransparentPadding(width / 2), let ciImage = CIImage(image: image) else { return nil }
        let context = CIContext(options: nil)
        let expandedExtent = ciImage.extent
        let expandedImage = ciImage
        // Turn every channel into the alpha channel to get a solid silhouette.
        guard let alphaMaskFilter = CIFilter(name: "CIColorMatrix") else { return nil }
        alphaMaskFilter.setValue(expandedImage, forKey: kCIInputImageKey)
        alphaMaskFilter.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputRVector")
        alphaMaskFilter.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputGVector")
        alphaMaskFilter.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputBVector")
        alphaMaskFilter.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
        guard let alphaImage = alphaMaskFilter.outputImage else { return nil }
        // Extract the silhouette's edge at the requested width.
        guard let edgeFilter = CIFilter(name: "CIMorphologyGradient") else { return nil }
        edgeFilter.setValue(alphaImage, forKey: kCIInputImageKey)
        edgeFilter.setValue(width, forKey: "inputRadius")
        guard let edgeMaskImage = edgeFilter.outputImage else { return nil }
        // Generate a constant-color image for the outline color.
        guard let constantColorFilter = CIFilter(name: "CIConstantColorGenerator") else { return nil }
        constantColorFilter.setValue(CIColor(color: color), forKey: kCIInputColorKey)
        guard let colorImage = constantColorFilter.outputImage else { return nil }
        let coloredEdgeImage = colorImage.cropped(to: expandedExtent)
        guard let colorClampFilter = CIFilter(name: "CIColorClamp") else { return nil }
        colorClampFilter.setValue(edgeMaskImage, forKey: kCIInputImageKey)
        colorClampFilter.setValue(CIVector(x: 1, y: 1, z: 1, w: 0.0), forKey: "inputMinComponents")
        colorClampFilter.setValue(CIVector(x: 1.0, y: 1.0, z: 1.0, w: 1.0), forKey: "inputMaxComponents")
        guard let colorClampImage = colorClampFilter.outputImage else { return nil }
        guard let sharpenFilter = CIFilter(name: "CISharpenLuminance") else { return nil }
        sharpenFilter.setValue(colorClampImage, forKey: kCIInputImageKey)
        sharpenFilter.setValue(10.0, forKey: "inputSharpness") // Adjust sharpness level
        sharpenFilter.setValue(10.0, forKey: "inputRadius")    // Adjust radius
        guard let sharpenedImage = sharpenFilter.outputImage else { return nil }
        colorClampFilter.setValue(CIVector(x: 0.0, y: 0.0, z: 0.0, w: 0.0), forKey: "inputMinComponents")
        colorClampFilter.setValue(CIVector(x: 0.0, y: 0.0, z: 0.0, w: 1.0), forKey: "inputMaxComponents")
        colorClampFilter.setValue(expandedImage, forKey: kCIInputImageKey)
        guard let expandedMaskImage = colorClampFilter.outputImage else { return nil }
        guard let compositeFilter = CIFilter(name: "CISourceOverCompositing") else { return nil }
        compositeFilter.setValue(sharpenedImage, forKey: kCIInputBackgroundImageKey)
        compositeFilter.setValue(expandedMaskImage, forKey: kCIInputImageKey)
        guard let maskImage = compositeFilter.outputImage else { return nil }
        guard let blendFilter = CIFilter(name: "CIBlendWithMask") else { return nil }
        blendFilter.setValue(coloredEdgeImage, forKey: kCIInputImageKey)
        blendFilter.setValue(maskImage, forKey: kCIInputMaskImageKey)
        guard let outlineImage = blendFilter.outputImage else { return nil }
        // Apply the requested alpha to the original image.
        let rgba = [0.0, 0.0, 0.0, alpha]
        guard let colorMatrix = CIFilter(name: "CIColorMatrix") else { return nil }
        colorMatrix.setDefaults()
        colorMatrix.setValue(expandedImage, forKey: kCIInputImageKey)
        colorMatrix.setValue(CIVector(values: rgba.map { CGFloat($0) }, count: 4), forKey: "inputAVector")
        // Named fadedImage to avoid redeclaring alphaImage in the same scope.
        guard let fadedImage = colorMatrix.outputImage else { return nil }
        compositeFilter.setValue(fadedImage, forKey: kCIInputImageKey)
        compositeFilter.setValue(outlineImage, forKey: kCIInputBackgroundImageKey)
        guard let finalImage = compositeFilter.outputImage else { return nil }
        guard let cgImage = context.createCGImage(finalImage, from: expandedExtent) else { return nil }
        return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
    }

    func addTransparentPadding(_ padding: CGFloat) -> UIImage? {
        let newSize = CGSize(width: self.size.width + (2 * padding),
                             height: self.size.height + (2 * padding))
        UIGraphicsBeginImageContextWithOptions(newSize, false, self.scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        // Ensure transparency by setting a clear background
        context.clear(CGRect(origin: .zero, size: newSize))
        // Draw the image centered inside the padding
        let origin = CGPoint(x: padding, y: padding)
        self.draw(in: CGRect(origin: origin, size: self.size))
        let paddedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return paddedImage?.withRenderingMode(.alwaysOriginal)
    }
}
Usage is simple:
imageView.image = myImage.withOutline(width: 5, color: .red, alpha: 1)
Apparently you can do #[repr(C, packed(<your_alignment>))], which does exactly what I wanted.
Now, does your system work? If yes, please share the details.
I had a problem connecting MongoDB Atlas with Power BI, but I followed this video and it helped me solve it:
https://youtu.be/v8W3lX1BLkY?si=NyG2M9aWDvxcTy9a
You may encounter a schema problem after the connection; just search the MongoDB community and you will find the solution.
Thanks!
This question shows up when searching "How to clear Visual Studio cache".
I asked that question referring to "the build cache", i.e. I was seeing Build errors that seemed wrong/outdated.
For example, Project A referenced Project B, and Project A used BType defined in Project B. But I got an error, like the enum wasn't defined, even though it was clearly defined in Project B.
Using "Clean Solution" did not work for me, the Build errors persisted.
Instead I modified the references of Project "A", removed the "Project B" reference, and re-added it.
I don't know why that helped, I think I was using a "project" type reference before and after (vs using a "dll" reference)... But the build errors stopped.
I had a firestore.rules syntax error as the other user mentioned. I commented out some rules, but forgot to remove the && that preceded the commented out code. Spent forever trying to figure this out with Gemini...I finally googled it and this was the first link 🫠. Such an unspecific error. You would think either the firebase extension would catch it, or the emulator on startup would catch it, or that it would just straight up tell you in the error message there's an issue with your firestore rules 🙄
I had the same issue on OpenCart, showing a title in the Portuguese language. I tried many ways, but nothing helped.
Then I tried the code below and it worked:
$string = 'SEDEX em até 3 dias úteis';
echo html_entity_decode($string);
After that it displays correctly!
It looks like I found something that works for a csh host shell, using concat and separating the quoted and curly-braced sections of my intended command string. No more !!: nonsense, no more something trying to resolve the $jobRunName variable at the wrong time; this actually gives me the expected $ in my alias:
set queueNameBatch "testQueue"
set jobQcmd [ concat "sbatch -o ./logs/srun.%j.log -e ./logs/srun.%j.err " { -J \$jobRunName -p } ]
set-alias qt "$jobQcmd $queueNameBatch "
which qt
qt: aliased to sbatch -o ./logs/srun.%j.log -e ./logs/srun.%j.err -J $jobRunName -p testQueue
This works regardless of if the shell already has a setenv jobRunName already done or not. The alias seems to work, and of course gives an undefined variable error if I have not done any setenv jobRunName before running the alias, which is expected.
If I do a setenv jobRunName before running this alias then it does work as expected. Doing setenv jobRunName now can happen AFTER loading this module as long as it happens BEFORE running the alias.
There is something else I'm trying to add: appending some portions together to make a path from a shell variable left as-is plus a Tcl variable evaluated to its value. concat seems to add a space at the joint in the middle of the output for this path, which breaks it; maybe a join for that portion, but I haven't got it working all the way I want just yet. I will update if/when I get that satisfactory too.
You actually don't need a workaround.
The preview is showing when you open the androidMain/MainActivity.kt file.
Just dock this file to the right side and you can simply drag the preview panel over the whole code.
You don't really need to be able to see the MainActivity.kt code.
That way the preview panel that is linked to MainActivity.kt is just showing on the right side of your screen, while you create/edit your layout in the commonMain/YourAppName.kt file.
You can configure the env var SPARK_WORKER_CORES, which makes Spark assume you have that number of cores regardless of what you actually have. If you are interested in other optimizations for Databricks, you can check here: https://zipher.cloud/
With the approach you are currently using, you would have to make it mutable. If you want to make the uuid immutable, you have to provide it during the user sign up process, along with other attributes.
I’ve resolved this issue. The problem was that Stripe couldn’t access its secret key due to environment variables not being loaded properly.
To fix this, you can create a config folder and add a loadEnv.js file inside it. In that file, load your environment variables using dotenv like this:
Then, import this file at the very top of your app.js (or wherever you need the environment variables):
This ensures your environment variables (like STRIPE_SECRET_KEY) are loaded before anything tries to use them — resolving the Stripe access issue.
Here's probably the most idiomatic Nix code for what you're asking. Noogle is your friend for finding library functions.
{
rightModules = [
"group/expand"
"wireplumber"
"network"
"privacy"
]
++ lib.optional components.bluetooth "bluetooth"
++ lib.optional components.battery "battery"
++ [
"custom/wlogout"
];
}
Solved by updating Expo to SDK 53 and React to 19.
This is happening to me with Ruby on Rails. I opened the successfully deployed link, and it then returned a 502 Bad Gateway, so I deleted the Railway deployment.
did any of you manage to solve the error -103 on the speedface devices?
It's one of the Visual Studio 2022 components. This one is responsible for resolving types/symbols and providing GitHub Copilot.
From what I noticed, it's likely to use lots of memory if:
Killing it is fine - by doing so, the process will immediately restart with lower memory usage, and almost nothing will happen, except that it will crash GitHub Copilot until you restart Visual Studio.
If feasible, I'd recommend splitting one large source file into multiple classes (up to 1000 LoC each), as that means fewer symbols to resolve.
See also this answer.
Thanks to DJ Wolfson, I am using Ubuntu and I ran that command. Voila, it works!
Ubuntu/Debian: sudo apt install libcairo2-dev pkg-config python3-dev
macOS/Homebrew: brew install cairo pkg-config
Arch Linux: sudo pacman -S cairo pkgconf
Fedora: sudo dnf install cairo-devel pkg-config python3-devel
openSUSE: sudo zypper install cairo-devel pkg-config python3-devel
In the end I did some analysis of the file, and looked at existing Maya parsers out there. Some of them are very comprehensive, but ultimately what I discovered is that the Maya binary files use the IFF format -- a series of chunks, some of which are data chunks and some of which are groups of other chunks. A chunk contains a tag, a size, and potentially child tags. I was able to parse a file and find the chunk with the tag 'FINF' in it - the File Info tag - and parse that out. I stop parsing the file as soon as I find the data I am looking for, making it very fast. There is some complexity to support 32 and 64 bit modes; the width of the tag and size change from 4 to 8 bytes but the sub-tag size does not. The user must correctly align the start of each chunk on a 4 or 8 byte boundary depending on 32 or 64 bit.
I wrote my solution in Python; it's about 500 lines long.
I know links can rot, but here is a collection of resources if you want to write your own parsers:
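The chunk walk described above (a tag, a size, a payload, then the next chunk on an aligned boundary) can be sketched in a few lines of Python. This is a hedged, minimal illustration, not Maya's exact format: the 4-byte big-endian size, the 4-byte alignment, and the 'FINF'/'HEAD' payloads here are illustrative assumptions, and real .mb files add group chunks and a 64-bit variant.

```python
import struct

def read_chunks(data, offset=0, end=None, alignment=4):
    """Yield (tag, payload) pairs from a simple IFF-style byte stream.

    Each chunk: 4-byte ASCII tag, 4-byte big-endian size, then `size`
    payload bytes; the next chunk starts on an `alignment` boundary.
    """
    end = len(data) if end is None else end
    while offset + 8 <= end:
        tag = data[offset:offset + 4].decode('ascii')
        (size,) = struct.unpack('>I', data[offset + 4:offset + 8])
        yield tag, data[offset + 8:offset + 8 + size]
        offset += 8 + size
        offset += (-offset) % alignment  # align the next chunk's start

def make_chunk(tag, payload, alignment=4):
    """Build one padded chunk, for demonstration only."""
    raw = tag.encode('ascii') + struct.pack('>I', len(payload)) + payload
    return raw + b'\x00' * ((-len(raw)) % alignment)

# Build a tiny two-chunk stream and pull out the 'FINF' payload,
# stopping (conceptually) as soon as the wanted tag is found.
stream = make_chunk('HEAD', b'xyz') + make_chunk('FINF', b'version 2024')
info = dict(read_chunks(stream))
```

A real parser would also recurse into group chunks and switch the tag/size widths to 8 bytes in 64-bit mode, as the answer notes.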
.heart {
  fill: red;
  position: relative; top: 5px; width: 50px;
  animation: pulse 1s ease infinite;
}
#heart {
  position: relative; width: 100px; height: 90px;
  text-align: center; font-size: 16px;
}
@keyframes pulse {
  0% {transform: scale(1);}
  50% {transform: scale(1.3);}
  100% {transform: scale(1);}
}
Looking at your problem, you need to dynamically adjust the column widths so that the content heights match. This requires measuring the actual rendered text heights and iteratively adjusting the flex values until the heights are balanced. Here's a solution that accomplishes this:
// Global state
let leftFlex = 1;
let rightFlex = 1;
let isBalancing = false;
// DOM elements
const leftColumn = document.getElementById('leftColumn');
const rightColumn = document.getElementById('rightColumn');
const rebalanceBtn = document.getElementById('rebalanceBtn');
const balancingIndicator = document.getElementById('balancingIndicator');
const leftFlexValue = document.getElementById('leftFlexValue');
const rightFlexValue = document.getElementById('rightFlexValue');
// Update the displayed flex values
function updateFlexDisplay() {
    leftFlexValue.textContent = leftFlex.toFixed(2);
    rightFlexValue.textContent = rightFlex.toFixed(2);
}
// Apply the flex values to the columns
function applyFlexValues() {
    leftColumn.style.flex = `${leftFlex} 1 0`;
    rightColumn.style.flex = `${rightFlex} 1 0`;
    updateFlexDisplay();
}
// Main column-balancing routine
function balanceColumns() {
    if (!leftColumn || !rightColumn) {
        console.error('Columns not found');
        return;
    }
    // Show the balancing indicator
    if (!isBalancing) {
        isBalancing = true;
        balancingIndicator.style.display = 'inline';
        rebalanceBtn.disabled = true;
        rebalanceBtn.style.opacity = '0.6';
    }
    // Measure the current content heights
    const leftHeight = leftColumn.scrollHeight;
    const rightHeight = rightColumn.scrollHeight;
    console.log(`Current heights - left: ${leftHeight}px, right: ${rightHeight}px`);
    // If the heights are close enough (within 10px), we're done
    if (Math.abs(leftHeight - rightHeight) <= 10) {
        console.log('Columns balanced successfully');
        finishBalancing();
        return;
    }
    // Adjustment: the taller column gets more flex-grow, which makes it
    // wider, so its text wraps less and its height shrinks
    if (leftHeight > rightHeight) {
        // Left column is taller: widen it
        leftFlex *= 1.05;
        rightFlex *= 0.98;
        console.log('Adjusting: widening the left column');
    } else {
        // Right column is taller: widen it
        rightFlex *= 1.05;
        leftFlex *= 0.98;
        console.log('Adjusting: widening the right column');
    }
    // Apply the new flex values
    applyFlexValues();
    // Keep balancing after a short delay to allow a re-render
    setTimeout(() => {
        balanceColumns();
    }, 50);
}
// Finish the balancing process
function finishBalancing() {
    isBalancing = false;
    balancingIndicator.style.display = 'none';
    rebalanceBtn.disabled = false;
    rebalanceBtn.style.opacity = '1';
    console.log(`Balancing complete. Final values - left: ${leftFlex.toFixed(2)}, right: ${rightFlex.toFixed(2)}`);
}
// Reset and rebalance
function resetAndBalance() {
    if (isBalancing) {
        console.log('Balancing is already running');
        return;
    }
    console.log('Starting rebalance...');
    leftFlex = 1;
    rightFlex = 1;
    applyFlexValues();
    // Start balancing after a short delay
    setTimeout(() => {
        balanceColumns();
    }, 100);
}
// Handle window resizing
function handleResize() {
    if (!isBalancing) {
        console.log('Window resized, rebalancing...');
        setTimeout(resetAndBalance, 200);
    }
}
// Event listeners
document.addEventListener('DOMContentLoaded', function() {
    console.log('DOM loaded, starting automatic balancing...');
    // Apply initial values
    updateFlexDisplay();
    // Start automatic balancing once everything has rendered
    setTimeout(() => {
        balanceColumns();
    }, 200);
});
// Rebalance-button listener
rebalanceBtn.addEventListener('click', resetAndBalance);
// Window-resize listener (debounced)
let resizeTimeout;
window.addEventListener('resize', function() {
    clearTimeout(resizeTimeout);
    resizeTimeout = setTimeout(handleResize, 300);
});
// Debugging utilities
window.debugBalance = {
    getHeights: () => ({
        left: leftColumn.scrollHeight,
        right: rightColumn.scrollHeight,
        difference: Math.abs(leftColumn.scrollHeight - rightColumn.scrollHeight)
    }),
    getCurrentFlex: () => ({
        left: leftFlex,
        right: rightFlex
    }),
    forceBalance: () => resetAndBalance()
};
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Height-Balanced Columns</title>
<link rel="stylesheet" href="styles.css">
</head>
<body>
<div class="controls">
<button id="rebalanceBtn" class="rebalance-button">
Rebalance Columns
</button>
<span id="balancingIndicator" class="balancing-indicator" style="display: none;">
Balancing...
</span>
</div>
<div class="container" id="container">
<div class="column" id="leftColumn">
<h3 class="column-title">Latin</h3>
<p>Gallia est omnis divisa in partes tres, quarum unam incolunt Belgae, aliam Aquitani, tertiam qui ipsorum lingua Celtae, nostra Galli appellantur. Hi omnes lingua, institutis, legibus inter se differunt. Gallos ab Aquitanis Garumna flumen, a Belgis Matrona et Sequana dividit. Horum omnium fortissimi sunt Belgae, propterea quod a cultu atque humanitate provinciae longissime absunt, minimeque ad eos mercatores saepe commeant atque ea quae ad effeminandos animos pertinent important, proximique sunt Germanis, qui trans Rhenum incolunt, quibuscum continenter bellum gerunt. Qua de causa Helvetii quoque reliquos Gallos virtute praecedunt, quod fere cotidianis proeliis cum Germanis contendunt, cum aut suis finibus eos prohibent aut ipsi in eorum finibus bellum gerunt. Eorum una, pars, quam Gallos obtinere dictum est, initium capit a flumine Rhodano, continetur Garumna flumine, Oceano, finibus Belgarum, attingit etiam ab Sequanis et Helvetiis flumen Rhenum, vergit ad septentriones. Belgae ab extremis Galliae finibus oriuntur, pertinent ad inferiorem partem fluminis Rheni, spectant in septentrionem et orientem solem. Aquitania a Garumna flumine ad Pyrenaeos montes et eam partem Oceani quae est ad Hispaniam pertinet; spectat inter occasum solis et septentriones.</p>
<p>Apud Helvetios longe nobilissimus fuit et ditissimus Orgetorix. Is M. Messala, et P. M. Pisone consulibus regni cupiditate inductus coniurationem nobilitatis fecit et civitati persuasit ut de finibus suis cum omnibus copiis exirent: perfacile esse, cum virtute omnibus praestarent, totius Galliae imperio potiri. Id hoc facilius iis persuasit, quod undique loci natura Helvetii continentur: una ex parte flumine Rheno latissimo atque altissimo, qui agrum Helvetium a Germanis dividit; altera ex parte monte Iura altissimo, qui est inter Sequanos et Helvetios; tertia lacu Lemanno et flumine Rhodano, qui provinciam nostram ab Helvetiis dividit. His rebus fiebat ut et minus late vagarentur et minus facile finitimis bellum inferre possent; qua ex parte homines bellandi cupidi magno dolore adficiebantur. Pro multitudine autem hominum et pro gloria belli atque fortitudinis angustos se fines habere arbitrabantur, qui in longitudinem milia passuum CCXL, in latitudinem CLXXX patebant.</p>
</div>
<div class="column" id="rightColumn">
<h3 class="column-title">English</h3>
<p>All Gaul is divided into three parts, one of which the Belgae inhabit, the Aquitani another, those who in their own language are called Celts, in our Gauls, the third. All these differ from each other in language, customs and laws. The river Garonne separates the Gauls from the Aquitani; the Marne and the Seine separate them from the Belgae. Of all these, the Belgae are the bravest, because they are furthest from the civilization and refinement of our Province, and merchants least frequently resort to them, and import those things which tend to effeminate the mind; and they are the nearest to the Germans, who dwell beyond the Rhine, with whom they are continually waging war; for which reason the Helvetii also surpass the rest of the Gauls in valor, as they contend with the Germans in almost daily battles, when they either repel them from their own territories, or themselves wage war on their frontiers. One part of these, which it has been said that the Gauls occupy, takes its beginning at the river Rhone; it is bounded by the river Garonne, the ocean, and the territories of the Belgae; it borders, too, on the side of the Sequani and the Helvetii, upon the river Rhine, and stretches toward the north. The Belgae rises from the extreme frontier of Gaul, extend to the lower part of the river Rhine; and look toward the north and the rising sun. Aquitania extends from the river Garonne to the Pyrenaean mountains and to that part of the ocean which is near Spain: it looks between the setting of the sun, and the north star.</p>
<p>Among the Helvetii, Orgetorix was by far the most distinguished and wealthy. He, when Marcus Messala and Marcus Piso were consuls [61 B.C.], incited by lust of sovereignty, formed a conspiracy among the nobility, and persuaded the people to go forth from their territories with all their possessions, saying that it would be very easy, since they excelled all in valor, to acquire the supremacy of the whole of Gaul. To this he the more easily persuaded them, because the Helvetii, are confined on every side by the nature of their situation; on one side by the Rhine, a very broad and deep river, which separates the Helvetian territory from the Germans; on a second side by the Jura, a very high mountain, which is situated between the Sequani and the Helvetii; on a third by the Lake of Geneva, and by the river Rhone, which separates our Province from the Helvetii. From these circumstances it resulted, that they could range less widely, and could less easily make war upon their neighbors; for which reason men fond of war as they were were affected with great regret. They thought, that considering the extent of their population, and their renown for warfare and bravery, they had narrow limits, although they extended in length 240, and in breadth 180 Roman miles.</p>
</div>
</div>
<div class="flex-values" id="flexValues">
Current flex values: Latin <span id="leftFlexValue">1.00</span>, English <span id="rightFlexValue">1.00</span>
</div>
<script src="script.js"></script>
</body>
</html>
This solution works by:
- Measuring actual content heights using scrollHeight on the column elements
- Iteratively adjusting flex values: when one column is taller, its flex-grow is increased to make it wider, so its text wraps less and its height shrinks
- Recursive balancing that continues until the height difference is within an acceptable threshold (10px)
- Optional safety limits with min/max widths to prevent columns from becoming too narrow or too wide
The key insight is that flex-grow controls width: giving the taller column more flex-grow makes it wider, so its text wraps less and its height shrinks. The algorithm finds the balance point where both columns have approximately the same content height. You can adapt this for your React app by:
- Extracting the balancing logic into a custom hook
- Adding it to your existing layout components
- Adjusting the threshold and adjustment factors for your specific content
- Adding debouncing if you need to handle window resizing
The "Rebalance Columns" button lets you see the algorithm in action, and the flex values are displayed at the bottom so you can see how the columns are being adjusted.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    proxy = get_random_proxy()  # your own helper returning ip/port/username/password
    print(proxy)
    browser = p.chromium.launch(headless=DEV, proxy={
        "server": f"socks5://{proxy['ip']}:{proxy['port']}",
        "username": proxy['username'],
        "password": proxy['password']
    })
Check out Bungee. It meets all your criteria (good quality, permissive license, cross platform) and there's an upgrade path to a professional SDK.
The problem is that you use `if` instead of `while` before calling `wait()`. Due to spurious wakeups, threads can wake without a `notify()`, so you must re-check the condition in a loop to avoid incorrect behavior. This causes more than the intended number of threads to run simultaneously in your code. Using a `while` loop to guard `wait()` ensures the condition is truly met before proceeding. For a detailed explanation, see this article:
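As a minimal sketch of the same rule in a different language, Python's `threading.Condition` follows the identical wait/notify contract, and the predicate must likewise be re-tested in a `while` loop after every wakeup (the producer/consumer names here are illustrative, not from the original code):

```python
import threading

cond = threading.Condition()
items = []

def consumer(results):
    with cond:
        # 'while', not 'if': re-check the condition after every wakeup,
        # since a thread may wake spuriously without a notify()
        while not items:
            cond.wait()
        results.append(items.pop())

def producer(value):
    with cond:
        items.append(value)
        cond.notify()  # wake one waiter; it still re-tests the predicate

results = []
t = threading.Thread(target=consumer, args=(results,))
t.start()
producer(42)
t.join()
```

With `if` in place of `while`, a spurious or stale wakeup would let the consumer proceed on an empty list, which is exactly the over-admission bug described above.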
In case you are using zsh on a Mac, you can add the following to your .zshrc file:
PROMPT='%F{green}%~%f %F{cyan}➜%f '
I found this website helped answer my question, after being guided by David and Monhar Sharma here.
I'm not a coder, just a person who recognizes, after a year mind you… I was hacked and I fought tooth and nail, was a victim of many I see here representing themselves as professionals, but many were the tool to destroy my life. Backed by big business and gov… instead of saving my credit and car and relationship, all police and "professionals" made me a T.I.! Destroyed my life for "the greater good" lol… I figured this all out through paper, no access to internet, alone, no computer background whatsoever, put in hospitals to cover it up…. But I kept and keep finding Turing and learning, and finally have bags of papers and thousands of hours; lost 65 lbs and all contact to help or support; they were bought… SO WHO HERE WANTS TO STAND UP AND HELP!!! Or is this just a game for people behind a monitor to use great skills to hurt the ones you never see?
As of mid-2025, we are still getting 409 errors logged everywhere in our logs. Any idea when MS will have this fixed?
`ol.dac_player` is initialized as part of the `mtsOverlay` object when you call `ol = mtsOverlay('mts.bit')`. The full definition of `mtsOverlay` is here, in the same repo.

The `dac_player` is initialized as a NumPy array. NumPy arrays are initialized with a shape (corresponding to length here, as the array is one-dimensional). That is why you can access `dac_player.shape` in Cell 3.

In Cell 5, `ol.dac_player[:] = np.int16(DAC_sinewave)` performs an in-place copy of the sine wave into the `dac_player`. The `[:]` syntax is used to replace the values of the array without creating a new one. See this StackOverflow post for more information on that syntax.
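A standalone illustration of the `[:]` in-place assignment, using a plain NumPy array as a stand-in for the actual overlay buffer:

```python
import numpy as np

buf = np.zeros(4, dtype=np.int16)   # stands in for dac_player
view = buf                          # a second reference to the same memory

buf[:] = np.int16([1, 2, 3, 4])     # in-place copy: no new array is created
print(view)                         # the other reference sees the new values
# By contrast, `buf = np.int16([1, 2, 3, 4])` would rebind the name to a
# brand-new array, leaving `view` (and the hardware buffer) untouched.
```

For a memory-mapped hardware buffer like `dac_player`, this distinction matters: only the in-place form writes into the actual buffer.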
As you have not provided any source code illustrating the issue, but judging by the behaviour of the height, it might be an issue with the screen viewport. Try using `height: 100dvh`, which adjusts dynamically to the viewport height.
Hope it helps!
You can use the Froala SDK for a rich editor in Python. Please check out this repo: https://github.com/froala/wysiwyg-editor-python-sdk?tab=readme-ov-file
Alternatively, you can use a JS editor like Quill: https://quilljs.com/docs/quickstart
from fpdf import FPDF
import arabic_reshaper
from bidi.algorithm import get_display

# NOTE: FPDF's built-in "Arial" font is Latin-1 only; for real Arabic output
# you need to register a Unicode TTF font with add_font() and use that family.

class ArabicPDF(FPDF):
    def header(self):
        self.set_font("Arial", "B", 14)
        title = get_display(arabic_reshaper.reshape("شخصية ENTP 1w9 – المُصلِح المُبتكر"))
        self.cell(0, 10, title, ln=True, align="C")

    def chapter_title(self, title):
        self.set_font("Arial", "B", 12)
        self.set_text_color(0, 102, 204)
        reshaped_title = get_display(arabic_reshaper.reshape(title))
        self.cell(0, 10, reshaped_title, ln=True, align="R")
        self.set_text_color(0, 0, 0)

    def chapter_body(self, body):
        self.set_font("Arial", "", 11)
        lines = body.strip().split("\n")  # was "\\n", which never matches real newlines
        for line in lines:
            reshaped_line = get_display(arabic_reshaper.reshape(line.strip()))
            self.cell(0, 8, reshaped_line, ln=True, align="R")
        self.ln()

pdf = ArabicPDF()
pdf.add_page()

pdf.chapter_title("السمات الرئيسية:")
pdf.chapter_body("""
- مُبتكر يحترم المبادئ
- هادئ في المظهر، نشيط في الذهن
- يناقش من أجل الوضوح لا من أجل السيطرة
- توازن بين الحدس والنظام
""")

pdf.chapter_title("نقاط القوة:")
pdf.chapter_body("""
- مُصلح بطريقة إبداعية
- مقنع
- واعي ذاتيًا
- غير متحيز
- يؤمن بالتحسين المستمر
""")

pdf.chapter_title("نقاط التحدي:")
pdf.chapter_body("""
- نقد داخلي قاسٍ
- التفكير المفرط
- تأجيل المواجهات
- الإحساس بالوحدة في التغيير
""")

pdf.chapter_title("في العلاقات:")
pdf.chapter_body("""
- يحترم المساحة الشخصية
- يكره الدراما
- صديق وفيّ وناضج
- يشجع على النمو
""")

pdf.chapter_title("في العمل:")
pdf.chapter_body("""
- قائد بالفكر
- يحب المرونة مع هدف نبيل
- يرفض الروتين
- مناسب للريادة، التعليم، الإصلاح
""")

pdf.chapter_title("خارطة النمو:")
pdf.chapter_body("""
النمو الذاتي: لا تُفرط في جلد الذات
المشاعر: عبّر عنها
العلاقات: لا تنعزل
العمل: لا تقبل ما يُقيدك
التوتر: لا تدعه يتراكم
""")

pdf.chapter_title("شعارك:")
pdf.chapter_body("سأغيّر العالم، لكن أولًا... سأبدأ بتغيير فكرتي عنه.")

pdf.output("ENTP_1w9_Arabic_Profile.pdf")
From the view of the Eclipse/Californium (Scandium) project, that would be a very bad idea and may end in a denial of service. You will at least need something to filter those incoming "TOFU" handshakes; otherwise anyone may use Californium's benchmark client to fill up your device stores very fast.
It's been a while since I was up to date with LwM2M, but to get something implemented in Leshan, it will be much easier if it is part of the spec.
What I implemented in Eclipse/Californium for auto-provisioning is to use a specific, temporary key to establish a DTLS connection which only allows calling the "add device credential" API. The idea is to generate a key pair and use it for a "production charge", which at the "end of line" does a functionality check and executes that auto-provisioning.
Anyway, regardless of which way you go, you will need something additional to prevent a DoS from provisioning with "TOFU".
If you still have this problem in 2025: it looks like the Vich documentation forgot to mention that the form element for the file should have
'mapped' => false,
to avoid the serialisation.
Downgrading the Vue extension to 3.0.0-alpha.2 fixed the same issue for me
Can you please check this link: Error when Refreshing Power BI Gateway - Mashup error
We are having a similar issue and discussed it with the Microsoft team; they suggested using the setting EnabledResultdownload=0, and some reports started working after we tried it. It looks like something happened at the network level: we used to have working reports, and about two weeks ago all refreshes stopped working. After applying this setting, some are working fine again.
Are you looking for something like this: https://url-batch-opener.netlify.app/ ?
It is a tool for opening multiple URLs in customizable batches.
FOREGROUND_SERVICE_LOCATION & FOREGROUND_SERVICE_DATA_SYNC: required and correctly added for Android 14+; this ensures your foreground service is not silently blocked.
foregroundServiceType="location|dataSync": correct; you're telling the system what types of foreground tasks this service will handle.
android:stopWithTask="true":
✅ Means the service will stop when your app's task is removed (e.g. swiping it away from recent apps).
⚠️ This is fine if you're not trying to keep location updates running after the user kills the app manually.
❌ If your goal is to keep tracking even when the app is killed, this should be "false", and you need a strategy like WorkManager or a BroadcastReceiver for boot and kill handling.
Telegram doesn't allow sharing the code directly in another Telegram chat ("because this code was previously shared by your account"); you can share it in a format like 1 2 3 4 5 instead.
Your best bet would be storing all the message data you want, together with the message ID, in a database of some kind for every sent message, and then fetching the info from that database using the message ID you get from the on_raw_message_delete event. That effectively makes your cache persist across restarts, although it will use up more and more disk space.
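A minimal sketch of such a persistent cache using sqlite3 (the table and function names are made up for illustration; the `on_raw_message_delete` wiring is only indicated in comments, since it depends on your bot):

```python
import sqlite3

conn = sqlite3.connect("message_cache.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS messages ("
    "  message_id INTEGER PRIMARY KEY,"
    "  author_id  INTEGER,"
    "  content    TEXT)"
)

def store_message(message_id, author_id, content):
    # Call this for every message you see (e.g. from your on_message handler).
    conn.execute(
        "INSERT OR REPLACE INTO messages VALUES (?, ?, ?)",
        (message_id, author_id, content),
    )
    conn.commit()

def fetch_message(message_id):
    # Call this from on_raw_message_delete with payload.message_id.
    row = conn.execute(
        "SELECT author_id, content FROM messages WHERE message_id = ?",
        (message_id,),
    ).fetchone()
    return row  # None if the message was never cached
```

In `on_raw_message_delete` you would then call `fetch_message(payload.message_id)`, which works even when the message is no longer in the in-memory cache or the bot has restarted.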
I have made an abortable task myself: CTask
It should only be used for garbage collection of an infinitely running task and nothing else, as things like `using` or `try/catch` simply stop and break in the middle.
No newlines / anything at the end? (?![\S\s])
Only alphanumeric 7-bit ASCII? [a-zA-Z0-9]+
Put together:
^[a-zA-Z0-9]+(?![\S\s])
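A quick check of that pattern in Python. The negative lookahead `(?![\S\s])` rejects any remaining character at all, including a trailing newline, which `$` alone would tolerate:

```python
import re

pattern = re.compile(r"^[a-zA-Z0-9]+(?![\S\s])")

print(bool(pattern.match("abc123")))    # alphanumeric only -> True
print(bool(pattern.match("abc123\n")))  # trailing newline  -> False
print(bool(pattern.match("abc 123")))   # inner space       -> False
print(bool(pattern.match("")))          # empty string      -> False
```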
Has anyone encountered this recently? I have been stuck on this for so long, and it is so frustrating.
This is what I'm doing right now: create a task, and execute onTaskDispatched at the scheduled time, which for now just logs to the console.
I don't get a permission_denied error, so my service account has all the permissions; I only get INVALID_ARGUMENT.
This is my code snippet:
// [NOTIFICATION TASK CREATION START]
const scheduledTime = notification.scheduledTime.toDate();
const now = new Date();

// Use Cloud Tasks
const client = await getCloudTasksClient();
const parent = client.queuePath(PROJECT_ID, LOCATION, QUEUE_NAME);
const url = `https://${LOCATION}-${PROJECT_ID}.cloudfunctions.net/processScheduledNotification`;

// Calculate schedule time
const date_diff_in_seconds = (scheduledTime.getTime() - now.getTime()) / 1000;
const MAX_SCHEDULE_LIMIT = 30 * 24 * 60 * 60; // 30 days in seconds
let scheduledSeconds;
// If scheduled time is in the past or very near future (< 5 mins), schedule for 30 seconds from now
// Otherwise, schedule for the calculated time, capped at MAX_SCHEDULE_LIMIT
if (date_diff_in_seconds <= 0 || date_diff_in_seconds < 300) {
    scheduledSeconds = Math.floor(Date.now() / 1000) + 30;
} else {
    scheduledSeconds = Math.floor(Math.min(date_diff_in_seconds, MAX_SCHEDULE_LIMIT) + Date.now() / 1000);
}

const payload = JSON.stringify({ notificationId: notification.id });
const body = Buffer.from(payload).toString('base64');

// Create the task payload - send only the notification ID
const task = {
    httpRequest: {
        httpMethod: 'POST',
        url,
        headers: {
            'Content-Type': 'application/json',
        },
        // Send only the notification ID in the body
        body: body,
        oidcToken: {
            serviceAccountEmail: `{PROJECT-NUMBER}[email protected]`, // Use PROJECT_ID variable
            audience: url // To my service function below
        }
    },
    scheduleTime: {
        seconds: scheduledSeconds,
        nanos: 0
    }
};

const [response] = await client.createTask({ parent, task });

/**
 * Cloud Task handler for processing scheduled notifications
 */
export const processScheduledNotification = onTaskDispatched({
    retryConfig: {
        maxAttempts: 5,
        minBackoffSeconds: 30,
    },
    queueName: 'notification-queue',
}, async (data) => {
    console.log('⏲️ Received Cloud Task for scheduled notification');
    console.log('Received req object:', data);
});
What am I doing wrong? Any pointers? What's up with gRPC as well? I think it's correctly set up. Please help.
.NET 10 might help: New in the .NET 10 SDK: Execute a .cs file
dotnet run file.cs
https://github.com/dotnet/sdk/blob/main/documentation/general/dotnet-run-file.md
You can possibly solve this problem using the jQuery UI Tooltip.
Step 1:
Add these links (the jQuery and jQuery UI scripts are needed along with the stylesheet):
<link rel="stylesheet" href="https://code.jquery.com/ui/1.13.2/themes/base/jquery-ui.css">
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script src="https://code.jquery.com/ui/1.13.2/jquery-ui.js"></script>
Then initialize the tooltip:
$(document).ready(function () {
    $(document).tooltip({
        items: "tr",
        content: function () {
            return $(this).attr("title");
        }
    });
});
The tr elements must have a "title" attribute.
With Consul, you get some features that the default discovery service in Docker Swarm mode doesn't provide:
cross-cluster/multi-datacenter discovery
built-in mTLS
richer health checks
ACLs
Possible scenario:
Thread A calls write(big buffer), which writes partially.
The kernel gets notified that the device driver's write buffers have become available.
Thread B calls write(big buffer), which also writes partially.
Thread A continues after write() returns.
So the problem is not the atomicity of write() itself, but the fact that write() and the processing after write() are two different steps, not a single atomic operation.
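Because write() may return after writing only part of the buffer, each thread has to loop, and the interleaving above happens between those loop iterations. A minimal sketch of the usual retry loop (the lock is one way to keep each thread's record contiguous; the names here are illustrative):

```python
import os
import threading

_write_lock = threading.Lock()

def write_all(fd, data):
    # Retry until every byte is written: os.write() may write fewer
    # bytes than requested (pipes, sockets, device files).
    # The lock keeps two threads' records from interleaving between
    # partial writes.
    with _write_lock:
        view = memoryview(data)
        while view:
            n = os.write(fd, view)
            view = view[n:]

# demo on a pipe
r, w = os.pipe()
write_all(w, b"hello world")
os.close(w)
received = os.read(r, 1024)
print(received)  # b'hello world'
```

Without the lock (or some equivalent serialization), two concurrent `write_all` calls could still interleave their chunks, exactly as in the scenario above.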