When you use the { objectName } syntax on a module, it means you are importing a specific named export from that module, in this case objectName. Example:
example.js:
export const hello = () => {
console.log("hello");
};
export const hello_world = () => {
console.log("hello world");
};
Using example.js:
import { hello } from "./example.js";
hello();
In this case we can only use the hello() function.
To be able to use hello_world() as well:
import { hello, hello_world } from "./example.js";
hello();
hello_world();
If you need more clarification or something is not clear, do not hesitate to ask.
Add a Material widget as the parent of the Column.
I know it's been 12 years, but email clients strip some CSS properties for security purposes, and position is one of them. So it will work fine locally when you view your HTML file, but not when you upload it in your signature. If you inspect the email, you can see that the property has been completely removed.
As mentioned by @Shiv_Kumar_Ganesh, one solution is to use a background image instead, but you will find one more issue there: while forwarding the email, if you remove any of the existing content from the email, you will find your background image missing from the HTML.
If anyone knows a solution to my problem, please let me know.
If we assume address 0x100 is the start of the program, and the instruction mov ax, 5 occupies 0x100 to 0x103,
then the value 5, as a 16-bit quantity, is stored in the 2 bytes at 0x100 and 0x101. Intuitively it would be stored as 00 at 0x100 and 05 at 0x101, but in little endian it is stored as 05 at 0x100 and 00 at 0x101. The reason is that the least significant byte is stored at the lowest address, i.e. 05 at 0x100 and 00 at 0x101.
To understand this in detail, check my Medium blog about it: https://medium.com/@farhanalam0407/high-big-endian-and-low-small-endian-a365a724dd0c
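As a quick check, here is a minimal Python sketch (using the standard struct module rather than the original assembly) that lays out the same 16-bit value in both byte orders:
import struct

value = 5
little = struct.pack('<H', value)  # little endian: least significant byte first
big = struct.pack('>H', value)     # big endian: most significant byte first
print(little.hex(' '))  # 05 00 -> the bytes stored at 0x100 and 0x101
print(big.hex(' '))     # 00 05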
I was having the same issue when using Next.js 15 with Turbopack; disabling Turbopack fixed it for me.
Maybe some or all workers do not work with Turbopack (I am not sure).
package.json
{
"scripts": {
"dev": "next dev" // remove --turbopack option
}
}
Moving the answer from the question to make it better aligned with the SO format:
It is a bug and I'm not the only one experiencing this: https://github.com/dotnet/runtime/issues/109885
How to get both id_token and access_token?
I need the access token to let the user log in and the id_token to get user info.
Thanks!
How to sort a list of strings based on their lengths in Dart:
List<String> words = ["mass","as","hero","superhero"];
words.sort((w1, w2) => w1.length.compareTo(w2.length));
print(words);
output:
["as", "mass", 'hero", "superhero"]
Bit late to this question but for those with PHP < 7.4:
If you provide unique keys for each of the initial separate array elements, then array addition works.
Example:
const A = ["a", "aa", "aaa"];
const B = [10 => "b", 11 => "bb"];
const C = [20 => "c"];
const D = A + B + C;
Thanks for all the answers, this community is very helpful
This might save someone's day.
The error in our case was that the permissions for SSRS (SQL Server Reporting Services) were not sufficient. My IIS Application Pool was configured to use ApplicationPoolIdentity; I changed it to LocalSystem and that fixed it.
Also, adding a low-level try-catch helped me identify the real error, as this error is usually not accurate and there is an underlying error.
Without adding additional packages, you can directly use:
from google.colab.output import eval_js
print(eval_js("google.colab.kernel.proxyPort(5000)"))
This will provide you with a link that can be accessed remotely on your laptop.
I had a problem with a duplicated WidgetsFlutterBinding.ensureInitialized(); line in my main file.
It has become even simpler:
async function dostuff() {
  const [err, res] ?= await fetch('https://codingbeautydev.com');
}
Uncaught runtime error: TypeError: Cannot read properties of null (reading 'useRef') at exports.useRef (http://localhost:3000/static/js/bundle.js:34520:31) at BrowserRouter. In this case, try installing a matching version: npm i react-router-dom@version
Use the scientisttools python package
I removed the google_fonts from my pubspec.yaml and it worked.
Did you find any solution? I happen to be stuck on the same issue.
For this case we can use either Logic Apps or Power Automate; both are somewhat similar services.
Hope for the best.
Is it the same process for external Entra ID? When I open the licences page I can only see "This feature is unavailable or doesn't apply to the current tenant configuration". Should I have a premium subscription, or is the page itself not available for external Entra ID?
@Slaine06 how can I handle receiving the VoIP notification in the Dart code?
According to the docs it doesn't matter which one you use, they're the same. https://swr.vercel.app/docs/mutation
If you're working locally, you can try a Chrome extension called Allow CORS, which can solve your issue.
The Samba project (samba.org) offers a compatible implementation of the Active Directory network protocols that Windows clients will happily use as an AD server.
Not sure if it helps and serves your needs, but we have created a samba-container project https://github.com/samba-in-kubernetes/samba-container/ that also features an Active Directory server container.
Pre-built images are available here: https://quay.io/repository/samba.org/samba-ad-server
Sure, this is not native Windows AD, but it should be compatible enough for most purposes.
Possible reasons for the discrepancy:
Returning Users: Many downloads could be from users who previously purchased the app and are simply reinstalling it. Check the user acquisition reports in Google Play Console to differentiate between new and returning users.
Google Ads Attribution: Ensure conversion tracking for purchases is correctly set up in your Google Ads campaign. Ad clicks might not always result in purchases.
Refunds or Cancellations: Check the Order Management section in Google Play Console for refunded or canceled transactions.
Technical Issues: Test the purchase flow in your app to ensure it’s functioning correctly. Use logs to identify potential errors.
Delayed Reporting: Purchases may take time to appear, depending on payment methods or regional delays.
Fraudulent Installs: Investigate unusual install patterns. Some downloads might not represent genuine user activity.
We spent two days investigating with our DevOps team and eventually found out what was causing it: this breaking change https://learn.microsoft.com/en-us/dotnet/core/compatibility/containers/8.0/aspnet-port
The solution is to use:
skinparam maxmessagesize 180
It also affects arrow labels in state diagrams.
The "alternative" meta tag in the HTML version, which points to the plain version of the page is:
This tag is automatically detected by Lynx.
For me, ssh-add was running the "wrong" command.
On my Windows system, there were 2 ssh-add programs - Git's one, and the OpenSSH one that is included with Windows.
Git's one requires the ssh-agent to be started manually with the command line. The OpenSSH one uses the Windows service "OpenSSH Authentication Agent".
For me, this guide https://blog.devgenius.io/how-to-add-private-ssh-key-permanently-in-windows-c9647ebfca3e got me nearly where I needed to be, but that was only part of the puzzle - the missing piece was understanding that I actually had TWO ssh agents installed, and I needed to ensure I was trying to connect to the correct one:
Type where ssh-add to confirm which ssh-add will be invoked when you run the command:
c:\projects\keypay-dev\Basics\Payroll\Payroll>where ssh-add
C:\Windows\System32\OpenSSH\ssh-add.exe
c:\program files\Git\usr\bin\ssh-add.exe
This is how it should look if you want to use the Windows one, and thus benefit from the Windows service and not have to start it from the command line every session.
If the one in the Git folder is above, as it was for me before I corrected it, move C:\Windows\System32\OpenSSH to higher than c:\program files\Git\usr\bin in your PATH variable.
Hook can be found here https://developer.wordpress.org/reference/hooks/rest_dispatch_request/
function wpse_authenticate_page_route( $dispatch_result, $request, $route, $handler ) {
if ( strpos( $route, '/wp/v2/pages' ) !== false ) {
return new \WP_Error(
'rest_auth_required',
'Authentication required',
array( 'status' => 401 )
);
}
return $dispatch_result;
}
add_filter( 'rest_dispatch_request', 'wpse_authenticate_page_route', 10, 4 );
You may want to check that blocking this route doesn't cause problems for WordPress; they do say that blocking the API can break it. https://developer.wordpress.org/rest-api/frequently-asked-questions/#can-i-disable-the-rest-api
Add TrustServerCertificate=True; to your ConnectionString
Initially, MAS seemed like an experimental approach to AI, mainly due to the complexity of their coordination and the challenges of managing multiple agents. However, as the AI field has advanced, the real-world applications of MAS have proven to be transformative. From healthcare and autonomous vehicles to logistics, gaming, and disaster response, MAS is already solving complex problems that traditional, single-agent systems simply couldn't tackle.
The benefits of MAS go beyond theoretical advantages: they are actively changing industries, driving innovation, and enhancing efficiency. With the continuous improvement in AI, communication protocols, and decentralized computing, the potential of MAS will only increase. Additionally, future integration with technologies like blockchain and edge computing will make these systems even more robust, secure, and capable of real-time decision-making.
So we can say that multi-agent systems are far from being just hype. They represent a significant leap forward in AI development, with proven practical applications across industries. As AI continues to evolve, MAS will play an increasingly crucial role in shaping the future of technology and problem-solving.
For more information you may find this interesting as well: https://sdh.global/blog/ai-ml/multi-agent-systems-the-future-of-collaborative-ai/#:~:text=The%20benefits%20are%20already%20being%20used%20across%20different%20industries.
Still asking myself why @ChrisF deleted my previous answer where I added the YouTube video URL to give credit to the person who wrote the code and shared it on YouTube. I will write the scripts I found that helped me solve the same problem.
// inject.js
console.clear = () => console.log('Console was cleared');
const i = setInterval(() => {
if (window.turnstile) {
clearInterval(i);
window.turnstile.render = (a, b) => {
let params = {
sitekey: b.sitekey,
pageurl: window.location.href,
data: b.cData,
pagedata: b.chlPageData,
action: b.action,
userAgent: navigator.userAgent,
json: 1,
};
// we will intercept the message
console.log('intercepted-params:' + JSON.stringify(params));
window.cfCallback = b.callback;
return;
};
}
},5);
// index.js
const { chromium } = require('playwright');
const { Solver } = require('@2captcha/captcha-solver');
const solver = new Solver('Your twocaptcha API key');
const proxyServer = 'Proxy server'; // Proxy server manager
const proxyUser = 'Proxy user';
const proxyPassword = 'Proxy Password';
const example = async () => {
const browser = await chromium.launch({
headless: false,
devtools: false,
proxy: { "server": proxyServer, "username": proxyUser, "password": prpxyPassword },
});
const context = await browser.newContext({ ignoreHTTPSErrors: true });
const page = await context.newPage();
await page.addInitScript({ path: './inject.js' });
page.on('console', async (msg) => {
const txt = msg.text();
if (txt.includes('intercepted-params:')) {
const params = JSON.parse(txt.replace('intercepted-params:', ''));
console.log(params);
try {
console.log(`Solving the captcha...`);
const res = await solver.cloudflareTurnstile(params);
console.log(`Solved the captcha ${res.id}`);
console.log(res);
await page.evaluate((token) => {
cfCallback(token);
}, res.data);
} catch (e) {
console.log(e.err);
return process.exit();
}
} else {
return;
}
});
await page.goto('site url');
await page.waitForTimeout(5000);
await page.reload({ waitUntil: "networkidle" });
console.log('Reloaded');
};
example();
Please add a dead-letter queue to your target and set the RetryPolicy to 0, so that failed attempts are immediately sent to the DLQ for further inspection. Messages sent to a DLQ have metadata attributes explaining any issues/errors.
My fix was, instead of importing the error from mongodb:
import { MongoServerError } from 'mongodb';
to import it through mongoose:
import mongoose from 'mongoose';
if (error instanceof mongoose.mongo.MongoServerError) {
...
}
Thank you for that guys, resolved my issue straight away.
There are different methods.
1. Get the account info of the bonding curve account. Decode it, then read the virtual SOL reserves and virtual token reserves from the data. Use them to calculate the price. Every pump.fun token has the same total supply, i.e. 1 billion. Price * supply = market cap (a sketch follows below).
2. Get it from the last transaction. Check the swapped SOL and mint amounts. SOL amount / mint amount = price. Then, in the same way as in 1, multiply by the supply to get the market cap.
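For the first method, here is a minimal Python sketch of the arithmetic only, assuming you have already fetched and decoded the bonding curve account; the 6 token decimals and the example reserve numbers are assumptions, not real on-chain data:
LAMPORTS_PER_SOL = 1_000_000_000
TOKEN_DECIMALS = 6               # assumed decimals for pump.fun tokens
TOTAL_SUPPLY = 1_000_000_000     # every pump.fun token has a 1 billion supply

def price_and_market_cap(virtual_sol_reserves_lamports: int,
                         virtual_token_reserves_raw: int) -> tuple[float, float]:
    # price (in SOL) = virtual SOL reserves / virtual token reserves
    sol = virtual_sol_reserves_lamports / LAMPORTS_PER_SOL
    tokens = virtual_token_reserves_raw / 10**TOKEN_DECIMALS
    price = sol / tokens
    return price, price * TOTAL_SUPPLY

# Made-up reserve values, just to show the calculation:
price, mcap = price_and_market_cap(30_000_000_000, 1_000_000_000_000_000)
print(price, mcap)  # price in SOL per token, market cap in SOL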
import tkinter as tk

root = tk.Tk()
root.title("My app")
root.update_idletasks()
screen_width = root.winfo_screenwidth() - (root.winfo_rootx() - root.winfo_x())
screen_height = root.winfo_screenheight() - (root.winfo_rooty() - root.winfo_y())
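If the goal is to size the window to that usable screen area, a possible follow-up (my assumption about the intended use) would be:
root.geometry(f"{screen_width}x{screen_height}+0+0")  # fill the usable area, anchored at the top-left
root.mainloop()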
Hi, I encountered the same issue; all I did was delete the poetry.lock file and run poetry install.
You should remove the brackets around the custom rule:
'location' => [
'required',
'string',
'min:3',
'max:100',
new LocationIsValidRule(),
],
See https://laravel.com/docs/11.x/validation#custom-validation-rules
In my case, I load an h5 file from a dataloader. I think it may be caused by multiprocess loading in the backend. However, we can set an environment variable to avoid file locking:
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"
or
export HDF5_USE_FILE_LOCKING=FALSE
reference issue comment
I am trying to read a FITS file containing ROSAT data from the website (https://python4astronomers.github.io/astropy/tables.html).
Under Practical Exercises, the first exercise statement reads: Try and find a way to make a table of the ROSAT point source catalog that contains only the RA, Dec, and count rate. Hint: you can see what methods are available on an object by typing e.g. t. and then pressing <TAB>. You can also find help on a method by typing e.g. t.add_column?.
But my code:
(my_env) C:\Users\labus\Documents\Curtin\Python\pyproj>ipython --matplotlib
Python 3.13.1 (tags/v3.13.1:0671451, Dec 3 2024, 19:06:28) [MSC v.1942 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.30.0 -- An enhanced Interactive Python. Type '?' for help.
Using matplotlib backend: tkagg

In [1]: import matplotlib.pyplot as plt
   ...: import numpy as np
   ...: import astropy
   ...: import tarfile
   ...: from urllib import request
   ...: from astropy.table import Table
   ...: from astropy.io import ascii
In [2]: from astropy.table import Table, Column
In [3]: f = open('ROSAT.fits', 'r')
UnicodeDecodeError                        Traceback (most recent call last)
Cell In[4], line 1
----> 1 f.read()

File c:\users\labus\documents\curtin\python\pyver\python313\Lib\encodings\cp1252.py:23, in IncrementalDecoder.decode(self, input, final)
     22 def decode(self, input, final=False):
---> 23     return codecs.charmap_decode(input,self.errors,decoding_table)[0]

UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 18179: character maps to <undefined>

In [5]:

This is giving the above UnicodeDecodeError.
Could someone please provide some guidance and maybe an answer as to why this problem is occurring?
Any assistance is greatly appreciated. Thank you - Cobus Labuschagne
Since I don't have enough rep to comment, I'm writing an answer.
// when exporting
module.exports = myDateClass
// when importing
const Date = require('./myDateClass')
const date = new Date()
I also faced print issues when running a Flutter app on iOS. There is a way to get all the logs of the device using Xcode.
Open Xcode -> Window -> Devices and Simulators -> select the device you're running on -> Open Console.
This will open a window that shows the whole log of the device, and you can filter it with any keyword, using conditions like contains, does not contain, equals, does not equal, etc.
Redis Insight interprets all inserted characters as part of the query, including optional arguments such as INKEYS or LIMIT. To use optional arguments, you can try Workbench. It also offers syntax auto-completion for Redis Query Engine.
In Source Control you can "Show Stashes" from the menu in the REPOSITORIES section. Check the screenshot below:
Even though Google might say they will update the model on 2020-01-01 00:00:00 flat, the full rollout takes up to a week. During that time, you can get differing OCR results from run to run.
Also they sometimes change the model without notice.
Source: this is an issue I have been dealing with for years when using GOCR.
Short answer: yes
The use of Java records is mentioned in the documentation about object mapping:
Object creation
Spring Data automatically tries to detect a persistent entity’s constructor to be used to materialize objects of that type. The resolution algorithm works as follows:
[...]
4. If the type is a Java Record the canonical constructor is used.
[...]
Using annotations with records is not explicitly mentioned, but I was able to use annotation without any issue (spring data cassandra 4.2.x).
As it turned out, vcpkg installs libmupdf without taking care of its dependencies; a pull request should have fixed the issue but wasn't merged. Linking must be done manually (find_library(...)) for now.
In the event that someone else stumbles upon this question while having the same issue: you need to specify the parameter "time_format", as mentioned in the documentation here:
https://docs.splunk.com/Documentation/Splunk/9.4.0/RESTREF/RESTsearch#search.2Fjobs.2Fexport
It defaults to %FT%T.%Q%:z.
In your case, if you are looking for an ISO formatting, you need to specify %Y-%m-%dT%H:%M:%S.%Q%:z
The documentation about the various time formats used by Splunk is available here : https://docs.splunk.com/Documentation/Splunk/9.4.0/SearchReference/Commontimeformatvariables
Note that this also applies to the Splunk Python SDK, where you need to pass the "time_format" field as a kwarg.
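For the Python SDK case, a minimal sketch of passing that kwarg (the connection details and the search are placeholders; I'm assuming extra keyword arguments to jobs.export are forwarded to the search/jobs/export endpoint, which is how splunklib normally behaves):
import splunklib.client as client
import splunklib.results as results

# Placeholder connection details.
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# time_format is passed through to the search/jobs/export REST endpoint.
stream = service.jobs.export("search index=_internal | head 5",
                             output_mode="json",
                             time_format="%Y-%m-%dT%H:%M:%S.%Q%:z")

for result in results.JSONResultsReader(stream):
    print(result)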
A workaround to fix this without changing anything else is to insert the following in your WebClient project file:
<PropertyGroup>
<_ExtraTrimmerArgs>--keep-metadata parametername</_ExtraTrimmerArgs>
</PropertyGroup>
The real issue should be fixed in https://github.com/dotnet/runtime/issues/81979
Below is the statement for removing or replacing the schema prefix by transferring the object to dbo:
ALTER SCHEMA dbo TRANSFER [PrefixName].[salesorder]
@Bean
public ServletServerContainerFactoryBean createWebSocketContainer() {
ServletServerContainerFactoryBean container = new ServletServerContainerFactoryBean();
// Set the maximum text message buffer size
container.setMaxTextMessageBufferSize(512 * 1024);
// Set the maximum binary message buffer size
container.setMaxBinaryMessageBufferSize(512 * 1024);
// Set the async send timeout
container.setAsyncSendTimeout(20000L);
// Set the maximum session idle timeout (optional)
container.setMaxSessionIdleTimeout(300000L);
return container;
}
I ran into the same problem today and managed to fix it.
I had this code in my component:
const { join } = useMeeting({ ... ... });
useEffect(() => {
join();
}, []);
It seems the problem is related to join, so I added a timeout on it.
useEffect(() => {
setTimeout(() => {
join();
}, 500);
}, []);
This is a bad solution but it did save my day.
char s[15];
float val1 = 3.1415926535f;
_gcvt(val1, 10, s);     /* from <stdlib.h> (Microsoft CRT); converts to a string with 10 significant digits */
printf("s: %s\n", s);   /* formatted output needs printf from <stdio.h>, not write() */
There is a package for Android apps written with Flutter, but I'm not sure about iOS. Here it is: https://pub.dev/packages/flutter_background_video_recorde
This one works perfectly for me, producing one filename per line (short format):
ls -p . | grep -v '/'
I disagree with all the previous answers:
I would recommend manually performing all necessary validation checks before executing the persistence or database operation. For example, verify whether related entities exist or any foreign key dependencies are present. If a violation is detected, you can throw an IllegalStateException or a custom exception that clearly indicates the issue. This approach ensures that your business logic is handled explicitly in your service layer rather than relying on database constraints to handle errors.
Reference: https://stackoverflow.com/a/77125211/16993210
0.2.50 provides the "Adj. Close" column.
!pip install yfinance==0.2.50
import yfinance as yf
df = yf.download('nvda')
df.columns
MultiIndex([('Adj Close', 'NVDA'),
            (    'Close', 'NVDA'),
            (     'High', 'NVDA'),
            (      'Low', 'NVDA'),
            (     'Open', 'NVDA'),
            (   'Volume', 'NVDA')],
           names=['Price', 'Ticker'])
How do I make a correct field request in VB? All my methods of writing fields end in an error...
Absolute path: gives you a clear direction from the beginning to the exact spot, regardless of where you are now (the full address in the filesystem), e.g. /home/user/docs/report.txt.
Relative path: tells you how to get somewhere based on where you are already standing (your current working directory is your starting point), e.g. ../docs/report.txt.
Well, it turns out my phone being on Android 10 was the issue. According to the Android API docs, you need additional permissions:
https://developer.android.com/develop/connectivity/bluetooth/bt-permissions
Pretty sure it's because of version mismatching. If you use NumPy 1.x, downgrade pandas to 1.x; if you use NumPy 2.0, use a pandas version built against NumPy 2.
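A quick way to check which versions you actually have before pinning a matching pair (the exact pairing suggestions in the comments are assumptions; check the pandas release notes for your NumPy version):
import numpy as np
import pandas as pd

print("numpy:", np.__version__)
print("pandas:", pd.__version__)

# Then pin a matching pair, for example:
#   pip install "numpy<2" "pandas<2"        # stay on the 1.x line together
#   pip install "numpy>=2" "pandas>=2.2.2"  # assumed first pandas release with NumPy 2 support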
To solve this Windows security-related problem:
1- Open PowerShell as Administrator.
2- Type Get-ExecutionPolicy to check the current execution policy; if it is Restricted, then
3- Type Set-ExecutionPolicy RemoteSigned or Set-ExecutionPolicy RemoteSigned -Scope CurrentUser to change the policy.
4- Type y to confirm the changes.
I believe the package I recently created should resolve the issue, as it works seamlessly with both Angular 17 and Angular 18. The package is available on npm, and you can find it here: ngx-google-address-autocomplete.
The original package, ngx-google-places-autocomplete, was last updated 5 years ago, at a time when Angular was not using Ivy by default for build and compilation. To address this and ensure compatibility with newer versions of Angular, I updated the package to support Angular 12 and later versions, including Angular 17 and 18.
If you're facing similar issues with address input fields or need to implement address auto-completion in your Angular project, this updated package could be a great solution.
Let me know if you need further assistance or clarification!
Use Wayland:
sudo apt install wl-clipboard # Debian
And in Vim:
:w !wl-copy
To copy everything, just press gg, then V, then G, and then execute the command.
You can disable deprecation warnings from PHP to fix this issue. As @ref2 mentioned, you can put this in your php.ini, or set it in your file using
error_reporting(E_ERROR | E_WARNING | E_PARSE | E_NOTICE);
or
error_reporting(E_ALL ^ E_DEPRECATED);
Source: https://stackoverflow.com/a/2803783/22557063
Since this is a Laravel app, I would recommend placing this in your AppServiceProvider (See docs).
If that doesn't work, you might consider placing it at the beginning of the artisan file, e.g. here. That should solve the deprecation message being displayed when running artisan commands
I believe the package I recently created should resolve the issue, as it works seamlessly with both Angular 17 and Angular 18. The package is available on npm, and you can find it here: ngx-google-address-autocomplete.
It provides an easy way to integrate Google Address Autocomplete into Angular applications. If you're facing similar issues with address input fields or need to implement address auto-completion in your Angular project, this package could be a great solution.
Let me know if you need further assistance or clarification!
Your training code may be causing high internet costs in Google Colab due to:
1. Frequent Checkpoint Saving:
Saving the checkpoint after every epoch can increase disk I/O operations and might sync with your Google Drive (if mounted), consuming bandwidth. Consider saving checkpoints less frequently, such as every 5 or 10 epochs (see the sketch after this list).
2. Visualization:
Frequent visualizations, especially when training models, can use significant resources. Reduce the frequency of visualizations or save plots locally instead of displaying them.
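As a rough illustration of the first point, here is a minimal PyTorch-style sketch that checkpoints every few epochs instead of every epoch (the tiny model, dummy data, and output path are all placeholders, not your actual training code):
import torch
from torch import nn, optim

model = nn.Linear(10, 1)                         # tiny stand-in model
optimizer = optim.SGD(model.parameters(), lr=0.01)
num_epochs, save_every = 20, 5                   # checkpoint every 5 epochs, not every epoch

for epoch in range(num_epochs):
    x, y = torch.randn(32, 10), torch.randn(32, 1)   # dummy batch
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if (epoch + 1) % save_every == 0:            # fewer writes means less Drive sync traffic
        torch.save({"epoch": epoch,
                    "model": model.state_dict(),
                    "optim": optimizer.state_dict()},
                   f"ckpt_{epoch + 1}.pt")       # point this at a Drive path only if you need it synced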
Your role based approach would be the more general solution.
What's wrong here is that you forgot to add the created RolePermissionTypes to the RolePermissionTypeCollection in the addPredefinedRolePermissions method.
I am trying to run CefGlue on Linux and it does not seem to work. Can you help provide an example to run? Thank you.
I had a similar issue, resolved after installing Modular with the following command:
curl -sSL https://get.modular.com | sh
And after that I installed magic:
curl -ssL https://magic.modular.com/7bba6c72-9d06-414c-a815-05f327c7a19g | bash
The following commands then worked perfectly:
magic init my-project
cd my-project
magic add "python==3.11"
magic add max
In the end I moved my validation to the ICustomTokenRequestValidator. The validation now happens in ValidateAsync(CustomTokenRequestValidationContext context). Setting context.Result.IsError = true and populating context.Result.Error and context.Result.ErrorDescription causes the oidc-client-ts to throw an error during log in, and I catch this in the SPA. This works for my purposes.
Unfortunately, the validation that I needed to do wasn't as easy as it was in the OnTokenValidated event, as I didn't have the necessary information (specifically I needed access to the "id_token_hint"), so it did require some "hacks" to be able to pass the necessary information to the ICustomTokenRequestValidator.
Just stop the server, then run these commands:
watchman watch-del-all
watchman shutdown-server
I found a solution which works alright. It goes through the COM client (RDCOMClient) though, so I think it only works on Windows. But maybe it helps someone. It inserts number_of_rows rows after row start_row:
insert_empty_rows <- function(filename, sheet_name, start_row, number_of_rows){
# create an instance of Excel
excel_app <- RDCOMClient::COMCreate("Excel.Application")
# hide excel
excel_app[["Visible"]] <- FALSE
excel_app[["DisplayAlerts"]] <- FALSE
# open workbook
wb_rdcom <- excel_app$Workbooks()$Open(filename)
ws_rdcom <- wb_rdcom$Sheets(sheet_name)
# insert lines
for (. in 1:number_of_rows){
ws_rdcom$Rows(start_row + 1)$Insert()
}
# save and close workbook
wb_rdcom$Save()
wb_rdcom$Close()
excel_app$Quit()
# clean up
rm(excel_app, wb_rdcom)
wb_rdcom <- NULL
excel_app <- NULL
gc()
}
How did you resolve the issue?
Have you installed other packages in your base environment, other than conda and mamba? From the mamba documentation, this may lead to issues. I had the same issue when I accidentally conda installed some packages in my base environment. You could try uninstalling and reinstalling mamba, or uninstalling and reinstalling conda completely (make sure to save your environments beforehand if needed).
This happens to me as well. It turned out to be because I am using a suspend function, such as:
@ExecuteOn(TaskExecutors.BLOCKING)
suspend fun greet(): String
I'm working on a project that needs to build a report catalogue for all our reports in Cognos. I have recently gained access to the Cognos Content Store (SQL Server database), so I have been going through the tables. Luckily I found this thread :)
The SQL script posted by Michael Singer works for our version of Cognos (v 7.7), but I just wanted to ask what exactly the 'active = 1' in the where clause means, as I was looking at a different flag for active status in the CMOBJECTS table:
where disabled = 0 or disabled is null (to get active records)
Also, I saw a mention of getting the column names of each report via XML, but doesn't the CMOBJPROOPS13 table give a list of all parameters / column names used in each report and their order?
I need to get the number of times each report was run, who ran it, what source it is connected to, and any other pertinent information so that we can assess which reports will be migrated to a new system. Any pointers to tables to use for this would be greatly appreciated. Is there any documentation for the available tables in the Content Store? I can't seem to find any online (a lot of broken links).
FYI, this is the SQL script posted by Michael Singer that works for us in Cognos 7.7:
select ob2.cmid, c.name as classname, n.name as objectname, o.DELIVOPTIONS as
deliveryoptions, z2.name as owner
from CMOBJPROPS2 p
inner join CMOBJPROPS26 o on p.cmid=o.cmid
inner join CMOBJECTS ob on ob.cmid=o.cmid
inner join CMOBJECTS ob2 on ob.pcmid=ob2.cmid
inner join CMOBJNAMES n on n.cmid=ob2.cmid
inner join CMCLASSES c on ob2.classid=c.classid
left join CMREFNOORD2 z1 on z1.cmid = p.cmid
left join CMOBJPROPS33 z2 on z2.CMID = z1.REFCMID
where ACTIVE = 1 order by z2.name, objectName
Sorry, but the code from Black cat did not work for me. I got this:
After much trial and error I got this code to work:
Application.PrintCommunication = False
With ActiveSheet.PageSetup
.LeftHeader = ""
.CenterHeader = "&L&H&S&V&D&L&H&S"
.RightHeader = ""
End With
Application.PrintCommunication = True
It gives me just what I want.
But I cannot say I understand how it works. It would be nice to.
For Android, android:screenOrientation="portrait" should work after you have rebuilt your project after making this change.
For iOS, in Info.plist, use
<key>UISupportedInterfaceOrientations</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
</array>
and then rebuild the project.
<p-multiSelect
[options]="options"
[(ngModel)]="selectedItems"
selectedItemsLabel="{0} items selected">
</p-multiSelect>
I am not sure I understand why you would like to use the BIC, as it's usually a tool for model selection rather than a meaningful statistic for time series.
Another approach that works quite well could be to smooth your noisy signal to remove the noise (via a moving average, for instance), and remove the trend in your signal. Then use a Fourier transform and/or correlation to detect the periodicity in the signal (which should be the period at which the mean changes). From this it should be easy to approximate the means.
Here is a small example that I tested which worked quite well as a first approximation:
from scipy.fft import fft, fftfreq
import numpy as np

n = 50
ma = np.convolve(y, np.ones(n), mode='valid') / n  # denoised signal (moving average)
rm_trend = y - ((ma[-1] - ma[0]) / len(ma) * np.arange(len(y)) + ma[0])  # remove the linear trend
corr = np.correlate(rm_trend, rm_trend, mode='full')  # autocorrelation
corr = corr[corr.shape[0] // 2:]
freq = fftfreq(len(corr))  # frequencies
corr_fft = fft(corr, norm='forward')[1:len(corr) // 2]  # FFT, dropping the mean term
k = 1 / freq[np.argmax(np.abs(corr_fft)) + 1]  # dominant period, in samples
print(k)
Please tell me if this does not answer your question.
Here's an easy approach:
const A = 1; // Latin
const А = 2; // Cyrillic
const Α = 3; // Greek
if (A === 1 && А === 2 && Α === 3) {
  console.log("Cool, it worked!");
}
I don't think 2 at the same time is possible, despite the answer above.
Is there a limit here regarding how many such services I can add?
No limit, if there is enough memory and you are willing to handle the data exchange.
Will having more services that can also work in the foreground make my application too resource consuming?
Resource consumption depends on your actual code. An empty service consumes few resources (far fewer than an activity or fragment, because there is no UI).
As far as I know, the most popular apps use only one service to manage the tasks I mentioned.
A foreground service must show a notification, so you do not need to tell the user that you are using GPS; the system is already handling that.
Users may think that you are wasting their phone's battery, even though you may not be.
So you can do it, but I suggest you use one service that calls the three modules. Of course, if you need upload or other services, you should run a new service.
For mostly Structured Text code and Studio 5000 (aka Rockwell) programming, I use the L5X files, remove a lot of the stuff that makes diffing and merging annoying, and split the big file into one file per AOI (aka function) and per program.
See https://codeberg.org/aliksimon/l5x-git-tools for the code that I use as a pre-commit hook.
It has several customization options in the hope that it can be useful for others as well.
It makes unnecessary merge conflicts very rare but does not help much with graphical PLC languages.
Had the same issue. In my case, setting "Delegate IDE build/run actions to Maven" solved it.
Settings - Build, Execution, Deployment - Build Tools - Maven - Runner
Here in 2025 and the API still does not support it.
Check if your modules are up to date. If not, update them using pip.
Also, instead of using python <filename> to run your file, use the command streamlit run <filename>.
.success-message {
text-align: center;
max-width: 500px;
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
}
.success-message__icon {
max-width: 75px;
}
.success-message__title {
color: #3DC480;
transform: translateY(25px);
opacity: 0;
transition: all 200ms ease;
}
.success-message__title {
transform: translateY(0);
opacity: 1;
}
.success-message__content {
color: #B8BABB;
transform: translateY(25px);
opacity: 0;
transition: all 200ms ease;
transition-delay: 50ms;
}
.success-message__content {
transform: translateY(0);
opacity: 1;
}
.icon-checkmark circle {
fill: #3DC480;
transform-origin: 50% 50%;
transform: scale(0);
transition: transform 200ms cubic-bezier(.22, .96, .38, .98);
}
.icon-checkmark path {
transition: stroke-dashoffset 350ms ease;
transition-delay: 100ms;
}
.icon-checkmark circle {
transform: scale(1);
}
Maybe these are nice minor adjustments:
const joinByDelimiterButKeepAsArray = <T, D>(arr: T[], delimiter: D): (T | D)[] => {
return arr.flatMap((item, i) => i == 0 ? item : [delimiter, item])
}
windowOptOutEdgeToEdgeEnforcement worked for me
I am currently working on a project like this. Although I cannot provide you the actual code, I can provide you the blog post and reference that actually do this.
Both posts provide code examples, and the first one goes more into the theory behind this. I hope this helps.
I can't comment so I will probably delete this answer later, but I think Git attributes are the way to go. https://git-scm.com/book/en/v2/Customizing-Git-Git-Attributes#_merge_strategies
If you would like to enable screen capture, you must start the application with the --allow-screencapture command line flag.
More info: https://keepassxc.org/docs/KeePassXC_UserGuide
I guess Hyunny is the best............
To ensure your validation works as expected, I recommend using the refine method in your validation schema. This approach allows you to implement more complex and customized validation. Also, instead of manually triggering validation with verifyOtpTrigger('otp'), it's generally more efficient to use handleSubmit for form validation.
Here's an example of how you can implement a basic OTP form:
import { NumericFormat } from "react-number-format";
import { zodResolver } from "@hookform/resolvers/zod";
import { Controller, useFieldArray, useForm } from "react-hook-form";
import { Button, FormHelperText, Grid, TextField } from "@mui/material";
import { defaultValues, otpSchema, OtpValues } from "./otp-form.configs";
export const OtpForm = () => {
const form = useForm<OtpValues>({
defaultValues,
resolver: zodResolver(otpSchema),
});
const { fields } = useFieldArray<OtpValues>({
control: form.control,
name: "otp",
});
const errors = form.formState.errors;
const verifyOtpCode = (values: OtpValues): void => {
console.log(values);
};
return (
<form onSubmit={form.handleSubmit(verifyOtpCode)}>
<Grid container={true}>
{fields.map((field, index) => (
<Grid item={true} key={field.id}>
<Controller
name={`otp.${index}.value`}
control={form.control}
render={({ field: { ref, onChange, ...field } }) => (
<NumericFormat
customInput={TextField}
{...field}
inputRef={ref}
inputProps={{ maxLength: 1 }}
size="small"
onValueChange={({ floatValue }) =>
onChange(floatValue ?? null)
}
sx={{ width: 40 }}
/>
)}
/>
</Grid>
))}
</Grid>
{errors?.otp?.root && (
<FormHelperText error={true}>{errors.otp.root.message}</FormHelperText>
)}
<Button type="submit" variant="contained">
Verify OTP
</Button>
</form>
);
};
import { z } from "zod";
// TODO: move to the /shared/error-messages/otp.messages.ts
const OTP_CODE_INVALID = "Please provide a valid OTP code.";
export const otpSchema = z.object({
otp: z
.array(z.object({ value: z.number().nullable() }))
// Using refine is important here because we want to return only a single error message in the array of errors.
// Without it, we would receive individual errors for each of the 6 items in the array.
.refine((codes) => codes.every((code) => code.value !== null), OTP_CODE_INVALID),
});
export type OtpValues = z.infer<typeof otpSchema>;
export const defaultValues: OtpValues = {
otp: new Array(6).fill({ value: null }),
};
Thanks @falselight, you're right about this. For Ubuntu you should put the GeoIP.conf in /etc/GeoIP.conf. Worked for me.
I am also trying to implement Meta ads on iOS through bidding but can't get the code to work. Can you please share the Meta setup to load ads using bidding?