Selling, also called writing, a put option allows an investor to potentially own the underlying security at a future date at a much more favorable price. In other words, the sale of put options lets market players gain bullish exposure, with the added benefit of potentially acquiring the underlying security at a future date at a price below the current market price.
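The payoff of a sold put can be sketched in a few lines. The strike and premium below are hypothetical numbers chosen for illustration, not a recommendation:

```python
def short_put_payoff(spot_at_expiry, strike, premium):
    """Payoff per share for a sold (written) put option at expiry."""
    # The writer keeps the premium, but loses (strike - spot) if assigned.
    return premium - max(strike - spot_at_expiry, 0.0)

# Hypothetical example: sell a put struck at $100 and collect a $3 premium.
print(short_put_payoff(110, 100, 3))  # stock above strike: keep full premium -> 3
print(short_put_payoff(95, 100, 3))   # assigned at 100, offset by premium -> -2
```

Note that in the second case the effective purchase price is the strike minus the premium ($97), which is the "more favorable price" described above.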
Basel III, sometimes written Basel 3, was signed in December 2010 and is the third of the Basel Accords, the agreements that deal with risk management in the banking industry. Basel III is the global regulatory standard for bank capital adequacy, stress testing, and market liquidity risk. (Its predecessors, Basel I and Basel II, were less strict.) The norms were to be phased in beginning March 31, 2015, and fully implemented by March 31, 2018.

What is the purpose of Basel III? According to the Basel Committee on Banking Supervision, “Basel III is a comprehensive package of reform initiatives devised by the Basel Committee on Banking Supervision to improve the supervision, regulation, and risk management of the banking system.” Basel III is thus a continuation of the Committee’s efforts to strengthen the banking regulatory framework begun under Basel I and Basel II. The new Accord aims to boost the banking sector’s ability to deal with financial and economic stress, and to improve risk management and transparency. The Basel III measures aim to:

· Enhance the banking sector’s ability to absorb the shocks that come with financial and economic uncertainty.
· Improve risk management and governance in the banking industry.
· Improve transparency and disclosures by banks.

In short, the Basel III rules are aimed at improving banks’ capacity to withstand periods of economic and financial stress, since the new criteria are stricter than previous capital and liquidity requirements in the banking industry.

What are the primary differences between Basel III and the earlier accords, Basel I and Basel II?

Better Capital Quality: One of the fundamental features of Basel III is a considerably tighter definition of capital. The greater the loss-absorbing capacity of capital, the better.
As a result, banks will be stronger and better able to endure stressful circumstances.

Capital Conservation Buffer: Another important component of Basel III is that banks will now be obliged to maintain a capital conservation buffer of 2.5%. The goal of the conservation buffer is to guarantee that banks hold a capital cushion that can be drawn down to absorb losses during times of financial and economic hardship.

Countercyclical Buffer: Another crucial feature of Basel III is the countercyclical buffer, created to raise capital requirements in good times and lower them in bad times. When the economy overheats, the buffer will slow banking activity; when things are rough, it will support lending. The buffer will be made up of common equity or other fully loss-absorbing capital and will range from 0% to 2.5%.

Minimum Common Equity and Tier 1 Capital Requirements: Basel III raised the minimum requirement for common equity, the strongest form of loss-absorbing capital, from 2% to 4.5% of total risk-weighted assets. The overall Tier 1 capital requirement, which includes not just common equity but also other qualifying financial instruments, rises from 4% to 6%. Although the minimum total capital requirement stays at 8%, when paired with the conservation buffer, the required total capital rises to 10.5%.

Leverage Ratio: An examination of the 2008 financial crisis found that the value of several asset classes declined faster than past experience suggested. As a result, the Basel III rules incorporate a leverage ratio as a safety net: the amount of capital relative to total assets (not risk-weighted).

How Will Basel III Affect Indian Banks?
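The capital figures above can be turned into a small worked example. The ratios come from the text; the RWA figure of 1,000 is a made-up illustration:

```python
# Basel III minimum ratios cited above, as fractions of risk-weighted assets (RWA).
CET1_MIN = 0.045              # common equity Tier 1 (raised from 2%)
TIER1_MIN = 0.06              # total Tier 1 (raised from 4%)
TOTAL_MIN = 0.08              # total capital (unchanged from Basel II)
CONSERVATION_BUFFER = 0.025   # capital conservation buffer

def required_capital(rwa):
    """Minimum capital amounts implied by a given RWA figure."""
    return {
        "cet1": rwa * CET1_MIN,
        "tier1": rwa * TIER1_MIN,
        "total": rwa * TOTAL_MIN,
        "total_with_buffer": rwa * (TOTAL_MIN + CONSERVATION_BUFFER),
    }

# Hypothetical bank with 1,000 (in any currency unit) of risk-weighted assets:
# total capital with the buffer works out to about 105, i.e. the 10.5% cited above.
print(required_capital(1000))
```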
Basel III, which Indian banks must implement in accordance with RBI norms, will be a difficult undertaking not just for the banks but also for the Indian government. Indian banks are estimated to need around Rs 6,00,000 crore in external capital over nine years, by 2020 (estimates vary from organization to organization). Such capital expansion will weigh on the returns on equity of these institutions, particularly the public sector banks. The only comfort is that Indian banks have typically kept core and total capital levels considerably above the statutory requirement.
Market watchers have observed striking similarities between today’s industry and what happened when the dot-com bubble burst in 2000. If you, your parents, or your grandparents were harmed by the dot-com crisis of 2000, you might be wondering if there’s anything you can do to safeguard your portfolio this time. Here are some lessons from the dot-com bubble and the financial disaster that followed. Why did the dot-com bubble burst, and what caused it? Investors were enamored with all things internet-related in the mid-1990s. Stocks in dot-coms and other technology companies skyrocketed. Venture capitalists poured money into computer and internet start-ups. And zealous investors, frequently swayed by the excitement rather than the facts, continued to acquire shares in firms facing severe obstacles, betting that they would eventually succeed. That, however, did not occur. As firms ran out of cash and new sources of finance dried up, the excitement gave way to pessimism. Insiders and more knowledgeable investors began to liquidate their holdings. Average investors, many of whom entered the market later than the sophisticated money, lost money.
The ability to process data and perform complex calculations at high speeds is known as high performance computing (HPC). A 3 GHz CPU in a laptop or desktop computer can do roughly 3 billion calculations per second. While this is significantly faster than a person, it is nothing compared to HPC systems, which can execute quadrillions of calculations per second. Data is used to make new scientific discoveries, power game-changing inventions, and enhance the quality of life for billions of people all over the world. HPC is at the heart of scientific, industrial, and societal progress. The size and volume of data that enterprises must work with is rising quickly as technologies like the Internet of Things (IoT), artificial intelligence (AI), and 3-D imaging advance. The capacity to analyze data in real time is critical for many applications, including broadcasting a live sporting event, tracking a growing storm, testing new products, and evaluating market movements. Organizations require lightning-fast, highly dependable IT infrastructure to process, store, and analyze huge volumes of data to stay ahead of the competition.
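To put those figures on the same scale: comparing the text's 3 billion calculations per second against a petascale system (one quadrillion operations per second is used here as an illustrative order of magnitude for "quadrillions"):

```python
# Rough scale comparison from the text: a 3 GHz laptop core versus an HPC system.
# The petascale figure is an illustrative order of magnitude, not a benchmark.
laptop_ops_per_sec = 3e9   # ~3 billion calculations per second
hpc_ops_per_sec = 1e15     # one quadrillion calculations per second (petascale)

speedup = hpc_ops_per_sec / laptop_ops_per_sec
print(f"an HPC system at this scale is roughly {speedup:,.0f}x faster")
```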
Step 1: Identify your market

No strategy works for every market, and you should not force-fit a proven strategy onto a different market. The market can be anything, but look closely at the following points while choosing one.

The liquidity of the market: Choose the market such that your orders can be executed smoothly without impacting the price of the instrument. If your capital runs into millions, you naturally won't be able to buy penny stocks: your order would push the price up sharply, and you wouldn't be able to get out of the position.

The ease of execution: The instruments in the market should not regularly move into ban periods, and there should be no government interference in the set of stocks you trade.

Step 2: The timeframe and the frequency of trades

This is a critical step that will force you to make some tough trade-offs. A scalping strategy can generate a good amount of profit, but a major part of it gets eaten away by brokerage and charges. At the same time, a strategy with a very low trade frequency will generate too few signals to make significant profits. However, this can be treated as a variable and fixed later depending on the backtest results.

Step 3: Forming a hypothesis

This step needs a good amount of brainstorming. Beginners can find multiple strategies for free on various platforms and tweak them (because, as published, they don't work). Once your trading rules are clear, you can proceed to the next step. For a simple strategy, here are some of the rules you need to pin down:

Entry Point
Entry Time (if applicable)
Exit Point
Exit Time (if applicable)
Target
Stop Loss

You can add as many parameters as you like, depending on the complexity of the strategy.

Step 4: Historical data collection

Now comes the validation part: you need to validate the strategy against its performance over historical data.
Depending on your market and timeframe selection, you'll have to collect data reaching significantly far into the past. If the stock is widely traded and your strategy runs on daily data, you can easily find data on Yahoo Finance and other popular sites.

Step 5: Backtesting and result analysis

This is the most crucial step in the entire strategy-generation process. Backtesting is where you run the strategy over the historical data and generate a report. A strategy can seem profitable on past data and still lose you money, so there are multiple parameters to test your strategy on. You can visit BETAQUANTS for more details.

Step 6: Going live

Now that you are convinced by your backtest results and want to take the strategy live, you first need to do forward testing, generally termed paper trading. In manual paper trading emotions behave one way; with actual money on the line they change, and to handle that we strongly suggest going for trade automation.

There are a number of other factors that fall under the topics above, but I hope you now have a good idea of the setup needed to generate a strategy. In case of any queries, contact us at BETAQUANTS and we'll be happy to help. Also don't forget to check out our services and educational content. Thank you
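Steps 3 through 5 above can be condensed into a minimal sketch. The rule (a moving-average crossover), the parameters, and the price series below are all invented for illustration; a real backtest would load historical closes (e.g. daily data from Yahoo Finance) and report many more metrics than raw P&L:

```python
# Minimal backtest sketch: entry/exit rules (Step 3), a stand-in price series
# (Step 4), and a run over that data producing a tiny report (Step 5).

def sma(values, n, i):
    """Simple moving average of the n values ending at index i (inclusive)."""
    return sum(values[i - n + 1 : i + 1]) / n

def backtest(closes, fast=3, slow=5):
    """Long when the fast SMA is above the slow SMA; returns (total P&L, trade count)."""
    position, entry, pnl, trades = 0, 0.0, 0.0, 0
    for i in range(slow - 1, len(closes)):
        signal = sma(closes, fast, i) > sma(closes, slow, i)
        if signal and position == 0:        # entry rule
            position, entry = 1, closes[i]
        elif not signal and position == 1:  # exit rule
            pnl += closes[i] - entry
            position, trades = 0, trades + 1
    if position:                            # close any open trade at the end
        pnl += closes[-1] - entry
        trades += 1
    return pnl, trades

closes = [100, 101, 103, 102, 104, 106, 105, 103, 101, 100, 102, 105]  # synthetic data
print(backtest(closes))
```

Note that brokerage and charges (the Step 2 trade-off) are deliberately omitted here; subtracting a per-trade cost from `pnl` is the natural next refinement.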
The year was 2017, and Donald Trump was the most powerful leader on earth. The market was booming, having recovered from the post-housing-crash crisis. Everything was rosy apart from periodic outbursts in the president’s comments that sometimes made the market nervous. Most assets were in bullish mode for most of the year. Americans were enjoying some of the lowest tax rates ever, and the gate of easy cash created a wild flow into investment vehicles. There was some sense of caution that the great recession had occurred almost ten years earlier and might repeat. There was talk of the Fed planning to raise interest rates at least three more times the following year, a bad sign for the gold price. Yet despite competition from bitcoin, the Fed tightening cycle, subdued inflation, a tightening labor market, a stock market boom, and accelerating global growth, gold still gained more than 10% by the end of the year.

Fast forward to 2018. Gold entered the year at $1312.05 per oz and ended it at $1268. The Fed's tightening, raising rates four times in 2018, created a downturn for the gold price, and a stronger US dollar driven by rising rates added to the pressure. Despite the tariff trade war between China and the USA, the Fed's actions had the greater hand in pressuring gold.

In 2019, gold's performance was impressive: the yellow metal gained a staggering 18% plus. Trade wars, geopolitical risk, unresolved Brexit negotiations, and inverted yield curves were some of the factors that favored gold. Although there was talk of peace in the trade war between China and the United States, some investors were unsure of the outcome. That year, central banks purchased 12% more gold than in 2018, and some countries that had not been buying gold, such as Hungary, Colombia, and the Philippines, started to buy.
Most notably, Russia's and China's central banks bought significant amounts of gold. The yellow metal exited the year at $1476 per oz. Many pundits expected gold prices to fall in 2020 on the prospect of peace talks between China and the USA over the trade war. But an unexpected pandemic arrived in 2020. Covid-19 was the culprit behind the world economy's poor performance that year. Central banks worldwide promised low interest rates to sustain their economies through the lockdowns, and these low rates pushed the gold price above $2000 per oz at one point in 2020.

The year 2021 was a year of recovery. Low rates and government stimulus created a robust market. Stocks went up, and so did cryptocurrencies, but it was a terrible year for gold. World GDP rebounded strongly after the economic crisis of 2020. Everybody you knew was into cryptocurrencies amid massive money printing and stimulus checks in the mail. Even though some inflationary fears circulated in the market, gold prices did not gain traction; the expectation of rising interest rates was a headwind for gold. Countries started to move past the pandemic as the virus mutated into less severe variants and infection numbers fell. Everything looked better, but not for gold.

The current year has an interesting edge for the gold price. 2022 started with the Russian attack on Ukraine. Despite the prospect of rising interest rates aimed at curbing inflation, the war makes gold shine. The Ukraine war has created a massive supply-chain problem; wheat, crude oil, and sunflower oil are some of the commodities affected. As a result, inflation has run wild, and there is talk of interest rate hikes that could turn the market into a plunge or recession. Thank you!
It’s like a fight with a much larger man who knocks people out cold. So, what should you do?

1- Study how he fights and what his tendencies are, or
2- Just don’t do anything.

A lot of people try to learn every strategy in the book, and yet it won't matter, because the win rate is small and the losses are big. The good news is that this opponent can also make you win big.

So, in the FOREX market, do you know who can make the price go up and down? It's not us (this is not stock trading), and it's not central governments either. FOREX is a 4-5 trillion dollar a day market; it would take entities with extraordinary trading capital to move it every day. We call them the "Big Banks". The interbank market accounts for almost 50% of the FOREX market. We can name them: Deutsche Bank, Citi, JP Morgan Chase, HSBC, and maybe a Chinese bank as well.

Once you understand how they do it and how this game is structured, you can avoid a lot of losses and increase your winning trades. If you are a trader at one of these banks, your job is to:

1- Take money out of the spot FOREX pool.
2- Redistribute that money back into the market, so the price goes up or down.

Now the real tale begins: whose money do they take? Before that, I want you to know this: traders at the big banks get to see something most of us cannot, namely where the money is sitting. In other words, whether most of the money is currently long or short, and whether most of the pending orders are long or short.

Let me give you an example. Imagine that most orders and most of the money are net long:

1- The big banks take the price long and reward everyone who went long (big banks lose).
2- They take the price immediately short, forcing those who went long to exit at a loss (big banks win).
3- They take the price long again, just to trip those long orders, then take the price short (big banks win even more).

Most of the time it's options 2-3. Now the big question: why do the big banks get you every time?
Most forex traders use technical analysis, which is good; technical analysis is key to this game. The problem is that they do it wrong. There is something we can call the "Dirty Dozen": 12 technical indicators that you will see everywhere (YouTube, blogs, and so on). Everyone talks about using them to win and make a lot of money. But if 90% of traders are using these 12 indicators, they are telling the big banks where they are going long or short. In effect they are saying: "Hey big banks, we are going this way, please beat us."

Humans are emotional. If you make money using these 12 indicators, you stay in the game (option 1 above, where the big banks lose), you start to think you will get rich, and you chase the feeling of gaining that money. Then the losses begin. In short, the big banks give you money to keep you in the game and extract money from you when they can.

So how can we win at this game? Well, we don't try to beat the big banks. Instead, we can use better technical indicators that predict fairly accurately where the price is going, and make sure to avoid the tools that make us part of the popular crowd using the Dirty Dozen.
Portfolio optimization is an approach to maximizing portfolio return while minimizing risk. Modern portfolio theory was introduced by Harry Markowitz in 1952 through his doctoral thesis. Markowitz's model assumes that a portfolio is designed to maximize its expected return for a given level of risk. In short, to achieve a higher expected return, greater risk must be taken: there is a trade-off between risk and expected return. The key idea in portfolio optimization is diversification. According to the theory, investors prefer less risky assets to riskier ones for a given level of return, and most investors should therefore invest in multiple asset classes to minimize risk and maximize return. Modern portfolio theory calculates the expected portfolio return as the weighted sum of the returns of the individual assets. For example, if there are five assets with returns of 5%, 7%, 10%, 4%, and 8% and each asset is weighted equally, then the return of the portfolio would be (20% x 5%) + (20% x 7%) + (20% x 10%) + (20% x 4%) + (20% x 8%) = 6.8%. The risk of the portfolio depends on the variance of each asset and the correlations of each asset pair. The greater the variance, the greater the expected return: this trade-off is the central assumption of portfolio optimization. Apart from variance, other risk measures such as the Sortino ratio, conditional value at risk, maximum drawdown, and statistical dispersion can be used to gauge risk, and quantitative techniques such as Monte Carlo simulation can also be used for risk assessment. Exchange-traded funds (ETFs) can be added to a portfolio for optimization; the inclusion of bonds, which correlate negatively with stocks, can decrease the volatility, or variance, of portfolio performance. Investors can compose their portfolios based on their risk appetite. A portfolio's profile can be plotted on a graph with the portfolio's risk on the X-axis and the expected return on the Y-axis.
The plot will show the most desirable risk-reward profile of a portfolio. For example, if portfolio one has a standard deviation of 7% and an expected return of 8%, and portfolio two has a standard deviation of 9% and an expected return of 8%, we will go for portfolio one, since we get the same return for a smaller risk. We can connect all the most efficient portfolios in an upward-sloping curve to see the optimized portfolios; the curve is known as the efficient frontier. Investing in portfolios below the curve is not desirable, since they do not maximize return for a given level of risk. In conclusion, portfolio optimization consists of creating a portfolio where the highest return is achieved and the lowest risk is attained. Thank you
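The two calculations described above can be sketched in a few lines: the weighted-sum expected return (reproducing the five-asset example), and a Monte Carlo scan of the risk/return points for a two-asset stock/bond mix. The stock and bond numbers here are illustrative assumptions, not market data:

```python
import random

# 1) Expected portfolio return = weighted sum of asset returns
#    (the five-asset example from the text).
returns = [0.05, 0.07, 0.10, 0.04, 0.08]
weights = [0.20] * 5  # equal weights
expected = sum(w * r for w, r in zip(weights, returns))
print(f"expected return: {expected:.3%}")  # 6.800%, as in the text

# 2) Risk/return points for mixes of a "stock" and a "bond" that are
#    negatively correlated (illustrative parameters).
mu = [0.08, 0.03]      # expected returns
sigma = [0.15, 0.05]   # standard deviations
rho = -0.3             # correlation between the two assets

def portfolio(w):
    """Expected return and standard deviation for stock weight w in [0, 1]."""
    ret = w * mu[0] + (1 - w) * mu[1]
    var = (w * sigma[0]) ** 2 + ((1 - w) * sigma[1]) ** 2 \
        + 2 * w * (1 - w) * rho * sigma[0] * sigma[1]
    return ret, var ** 0.5

random.seed(0)
points = [portfolio(random.random()) for _ in range(1000)]
best = min(points, key=lambda p: p[1])  # leftmost point: the minimum-variance mix
print(f"min-risk mix: return={best[0]:.2%}, stdev={best[1]:.2%}")
```

Plotting `points` with risk on the X-axis and return on the Y-axis traces out exactly the cloud whose upper-left edge is the efficient frontier described above.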
The usage of technology has had a significant impact on the Indian stock market, and many kinds of technology have changed the landscape of investment venues. Sentiment analysis, or opinion mining, uses technologies such as text mining and natural language processing (NLP) to analyze users' opinions, evaluations, sentiments, attitudes, and emotions. This work examines the importance of sentiment analysis for stock market indices such as the Sensex and Nifty in predicting stock prices. The widespread use of social networking sites like Facebook is beneficial for business analysts, who can mine users' opinions about products and use them as feedback on the products' performance; the analysis can then be used to improve the products. Sentiment analysis gathers users' opinions and classifies them as positive, negative, or neutral. However, there are some challenges. The first is that a word used to express an opinion can vary in meaning depending on context: the word "large", used for the size of a mobile device, can be negative, while used for the height of a person it can be positive, so subjectivity is an issue. The second challenge relates to the opinion holder, who may change his mind subjectively. It can also be challenging for a machine to understand the meaning of mixed statements; for example, "I like the picture quality, but the battery life is poor" combines positive and negative sentiment. Short statements, too, can be hard for both humans and machines to understand. The Indian stock market has gained interest among investors, who invest via the two main exchanges, the Bombay Stock Exchange and the National Stock Exchange, the prominent exchanges in India. So there is a need to predict stock prices for these indices.
Thus, sentiment analysis is used for these indices. Now let us turn to sentiment classification techniques. These techniques can be divided into machine learning and lexicon-based approaches. The machine learning approach can be divided into supervised and unsupervised learning methods. Unsupervised learning is done without any explicit target output associated with the input; the machine learns through observation. The best-known approach in unsupervised learning is clustering, which groups similar data together. Some clustering algorithms are K-means, hierarchical clustering, Gaussian mixture models, self-organizing maps, and hidden Markov models. Supervised learning uses known datasets to predict the output and requires training and test sets. Supervised learning techniques include decision tree classifiers, rule-based classifiers, probabilistic classifiers, and linear classifiers. A decision tree uses a hierarchical decomposition of the training data, dividing it based on attributes such as the presence or absence of certain words; each non-leaf node is associated with a feature, and each leaf with a classification value of positive or negative. A rule-based classifier is based on the emotions in the text: for example, emotionally positive words are classified as positive and vice versa. A probabilistic classifier predicts the class of an input given a probability distribution. A Naïve Bayes classifier is a probabilistic model that estimates the probability that a text belongs to the negative or positive class. A maximum entropy classifier is another probabilistic model, in which the distribution with the largest entropy is chosen. Lastly, a linear classifier partitions a set of objects into their respective domains; two famous linear classifiers are the Support Vector Machine and neural networks. The Support Vector Machine is used widely for classification and regression analysis.
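The Naïve Bayes approach described above can be sketched with a toy classifier. The four training sentences are invented examples, not a real corpus, and real systems use far larger datasets and proper tokenization:

```python
from collections import Counter
import math

# Toy Naive Bayes sentiment classifier with Laplace (add-one) smoothing.
# The training sentences below are made-up illustrations.
train = [
    ("great product love it", "pos"),
    ("excellent quality very happy", "pos"),
    ("terrible battery poor quality", "neg"),
    ("bad screen very disappointed", "neg"),
]

counts = {"pos": Counter(), "neg": Counter()}  # word counts per class
docs = Counter()                               # document counts per class
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = set(w for c in counts.values() for w in c)

def classify(text):
    """Pick the class with the highest log-posterior."""
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        score = math.log(docs[label] / sum(docs.values()))  # class prior
        for w in text.split():
            score += math.log((c[w] + 1) / (total + len(vocab)))  # smoothed likelihood
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("poor battery"))       # → neg
print(classify("love the quality"))   # → pos
```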
The Support Vector Machine determines the linear separator in the search space that best separates the different classes. A neural network uses multiple layers of neurons to form the non-linear boundaries that enclose the regions of a particular class. An artificial neural network linearly combines a neuron's inputs with different weights; the combination is then fed into a non-linear activation unit, which in its simplest form can be a threshold unit. Neural networks offer non-linearity, input-output mapping, adaptivity, and fault tolerance. The lexicon-based approach yields good cross-domain performance. This method assumes that the sum of the sentiment orientations of the individual words gives the contextual sentiment orientation. There are two types of lexicon-based methods: the dictionary-based approach and the corpus-based approach. The dictionary-based approach uses predefined words, each associated with some sentiment polarity; emotions such as happiness, sadness, and depression can be found in lexicons derived from dictionaries. The corpus-based approach tries to find common patterns of words to determine sentiment: it starts from a seed list of words and then finds other opinion words with similar contexts. For example, this method may assign the happiness factor of a word based on the frequency of its occurrence in happy or sad blog posts. The last methodology is the hybrid system, classified as sequential hybrid, auxiliary hybrid, or embedded hybrid. A sequential hybrid system uses the technologies in a pipeline-like structure; an auxiliary hybrid system uses a subroutine to manipulate the provided information; and in an embedded system, the participating technologies are integrated so that they appear intertwined. In a recent study by Singh, P. K. et al., sentiment analysis was applied to the Flipkart e-commerce website to filter irrelevant reviews.
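The dictionary-based lexicon method above reduces to summing word polarities. The word lists here are tiny invented seeds; real lexicons contain thousands of scored words:

```python
# Minimal dictionary-based lexicon scorer: each word carries a fixed polarity,
# and the sentence score is the sum. The seed lists are illustrative only.
POSITIVE = {"good", "like", "great", "happy", "love"}
NEGATIVE = {"bad", "poor", "terrible", "sad", "hate"}

def lexicon_score(text):
    """Sum of word polarities: >0 positive, <0 negative, 0 neutral/mixed."""
    words = text.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

# The mixed review from the text: "like" (+1) and "poor" (-1) cancel out,
# illustrating why mixed statements are hard for this approach.
print(lexicon_score("i like the picture quality but the battery life is poor"))  # → 0
```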
In another study, Gunduz et al. conducted sentiment analysis to determine the relationship between a university's academic success and sentiment about the university on social media, based on a Naïve Bayes classifier. The top 10 most successful Turkish universities, as ranked by URAP, were selected, with Twitter chosen as the medium for the study. Tweets were collected via the Twitter REST API, labeled as positive, negative, or neutral, and processed accordingly. In the evaluation, the system yielded a 72.33% success rate. Molla et al. analyzed users' opinions of Samsung products, using NodeXL to visualize the social network graph; future work may focus on the location of each tweet and the inclusion of emotions. Lu, Y. and Chen, J. studied opinion analysis of microblog content; the results showed classification precision exceeding 90% using a Support Vector Machine classifier. Batool, R. et al. analyzed 4000 tweets concerning food, diabetes, education, and movies for data and sentiment classification. They first used a knowledge generator to classify tweets into different categories, then enhanced the knowledge with a synonym binder to increase information gain; the results showed a significant improvement, from 0.1% to 55%. M. Merah and B. Diri performed sentiment analysis on Turkish tweets using Naïve Bayes, Support Vector Machine, and Random Forest classifiers; in their experiment, the Support Vector Machine gave the best results. Li, S., Wang et al. predicted the success rate of a movie using Twitter data; the LingPipe sentiment analyzer was the tool of choice, and the resulting prediction accuracy of 64.4% was better than the conventional system. Wang, X. and Luo conducted a study to predict movie performance from social networking data using a sentiment analysis technique involving a K-means clustering algorithm.
The prime aim of this research work is to fetch live server data using the Python programming language and perform sentiment analysis on it. First, Python is installed on an Ubuntu 14.04 LTS host machine; then the required software, such as Beautiful Soup, is installed from the command prompt. Beautiful Soup is a Python library for fetching live data: it pulls and saves the desired contents from the desired webpages, and it supports the HTML parser included in the Python standard library. The steps for sentiment analysis using Python are as follows:

1. Install Python and the required packages on the Ubuntu terminal.
2. Fetch live Sensex and Nifty data for sentiment analysis.
3. Pre-process the fetched data for feature selection.
4. Perform sentiment analysis for stock market prediction.
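The fetching step can be sketched as follows. The research uses Beautiful Soup; to keep this sketch dependency-free it uses the standard library's `html.parser` instead, and the HTML snippet, class names, and quote value below are all made-up stand-ins for a live quotes page:

```python
from html.parser import HTMLParser

# Stand-in for a fragment of a live quotes page (invented markup and value).
PAGE = '<div class="quote"><span class="name">Sensex</span><span class="px">81,000</span></div>'

class QuoteParser(HTMLParser):
    """Pulls the text of <span class="name"> and <span class="px"> elements."""
    def __init__(self):
        super().__init__()
        self.capture = None  # which field the next data chunk belongs to
        self.fields = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "px"):
            self.capture = cls

    def handle_data(self, data):
        if self.capture:
            self.fields[self.capture] = data
            self.capture = None

parser = QuoteParser()
parser.feed(PAGE)
print(parser.fields)  # {'name': 'Sensex', 'px': '81,000'}
```

With Beautiful Soup installed, the same extraction collapses to a pair of `find` calls over the fetched page; the parsed fields then feed into the pre-processing and sentiment analysis steps listed above.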