
Iron-Pen☑️
Bio
I hold an unending passion for words, with every letter carrying a piece of my soul. Each story is a journey to explore myself and the world. I aim to be a voice for the voiceless and sow seeds of hope and change in readers' hearts.
Stories (22)
Don't prolong the wait...!
Many people lose their way in the sea of life. Their days pass in moaning, crying, wailing, complaining, and blaming as they wait for someone to extend a helping hand and rescue them, drowning all the while in a sea of waiting for legions of saviors.
By Iron-Pen☑️ about a year ago in Motivation
How do the Japanese maintain their fitness for a lifetime?
Fitness culture and the discipline of training at a sports club have become a societal phenomenon, and companies now compete to open gyms and clubs in neighborhoods and cities for both men and women. Few people today don't belong to one, and it has become common to see people share videos of themselves exercising on social media, just as they post about meals at their favorite restaurants.
By Iron-Pen☑️ about a year ago in Motivation
What is creative thinking and what do we mean by creativity?
In fact, there are many definitions of creativity, so we will mention a few. One of the simplest is the following: “the process that leads to the creation of new ideas which, when implemented, prove useful and socially acceptable.”
By Iron-Pen☑️ about a year ago in Motivation
Love, Desire, and the Ache Within
I have never tried to write on such topics, so this article is the first of its kind in my new approach to writing. I hope to write things that help me express what I think and make sense of the many things I am still trying to understand.
By Iron-Pen☑️ about a year ago in Confessions
Dignity... Between Pain and Struggle
Our question is simple: we often use the concept of dignity without thinking about its content. Strangely enough, we don't even need to think about it when we say “my dignity comes first,” “that is beneath my dignity,” or “he insulted my dignity.” The word's intuitive clarity is not enough; its intellectual, linguistic, and traditional meanings remain poorly articulated, largely because ethics in general is poorly studied. We can speak of “moral dignity” as a commitment to the set of enduring norms a society has adopted, and of “communicative dignity” as a logic of recognizing one's interlocutor. When communication is built on dignity and recognition, we can speak of moral dignity; and when someone is said to carry themselves with dignity, they speak and think while holding on to a set of symbols of cultural expression.
By Iron-Pen☑️ about a year ago in Psyche
The Day I Learned It’s All Small Stuff
We sat on the knob of Mica Peak, rocky and covered in quartz, as iron-gray clouds pressed down on us. From our vantage point above the valley, we could see the mountain ranges of northeastern Washington and Idaho.
By Iron-Pen☑️ about a year ago in Confessions
AI Has Reached Its Breaking Point
Several researchers and AI specialists have pointed out that the quantity of training data, processing power, and energy consumption would all need to expand dramatically. But that simply isn't feasible, and even back in April, OpenAI and other generative AI startups appeared to already be butting up against severe limits in these areas. Subsequent reporting, interviews, and research have now confirmed what I and many others expected. What this implies for the AI business, and the economy at large, could be disastrous.

These stories came from The Information and Reuters. The Information released a piece revealing that OpenAI's next text AI, dubbed Orion, is only modestly better than its existing GPT-4o model despite using a significantly bigger training dataset. The outlet reported that “some researchers at the company believe Orion isn’t reliably better than its predecessor in handling certain tasks” and that “Orion performs better at language tasks but may not outperform previous models at tasks such as coding.” According to The Information, Orion attained GPT-4 levels of competence after being trained on only 20% of its training material, but scarcely progressed after that. Because AI training approaches have reportedly changed little in recent years, we can make an informed guess that Orion's training dataset is roughly five times bigger than GPT-4's, yet the model is not appreciably better. This neatly illustrates and confirms the diminishing-returns problem.

To drive this concern home even harder, Reuters interviewed OpenAI co-founder Ilya Sutskever, who recently left the company. In the interview, Sutskever said that the firm's latest experiments aiming to scale up its models show that such efforts have plateaued. As far as he is concerned, AI cannot get better merely by feeding it more data.

Recent findings also corroborate Sutskever's view and explain why Orion is ultimately underwhelming. One of these studies indicated that when AI models are given more data and grow bigger, they don't improve across the board; they get better at specialized tasks at the expense of their wider usefulness. You can see this in OpenAI's o1 model, which is bigger than GPT-4o and better at solving mathematical problems, but not as good at writing. You can also see it in Tesla's FSD: as the program got stronger at handling increasingly complicated traffic situations, it reportedly started to lose fundamental driving abilities and began to clip curbs on corners.

Yet another analysis determined that, at their present pace, generative AI companies like OpenAI will run out of high-quality new data to train their AIs on by 2026. As such, making AIs better merely by scaling these models up won't be a feasible option in the very near future. Indeed, others have speculated that the reason Orion is underperforming is that OpenAI can't gather enough data to make it any better than GPT-4o. Either way, this demonstrates that the predictions of generative AI stalling have come true.

There are several answers to this, such as improving how AIs are built to reduce the training data required, running numerous AIs in parallel, or introducing new computational architectures to make AI infrastructure significantly more efficient. However, all of these ideas are in their infancy and years away from being deployable.
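To make the diminishing-returns argument concrete, here is a minimal sketch. It models loss as a power law with a floor, the general shape reported in neural scaling-law research; the constants, and the mapping of 20% and 5x to the Orion claims, are my own illustrative assumptions, not OpenAI's actual numbers.

```python
# Toy illustration of diminishing returns in data scaling.
# Loss is modeled as L(D) = A * D**(-alpha) + L_inf, a power law with a floor.
# All constants are invented for illustration; they are NOT OpenAI's numbers.

def loss(data: float, A: float = 0.8, alpha: float = 1.0, L_inf: float = 2.0) -> float:
    """Hypothetical validation loss as a function of normalized dataset size."""
    return A * data ** -alpha + L_inf

prev = None
# Each step is 5x more data; 0.2 = "trained on only 20% of its material",
# 5.0 = the speculated Orion regime relative to GPT-4.
for frac in (0.04, 0.2, 1.0, 5.0):
    current = loss(frac)
    gain = "" if prev is None else f"  (improvement: {prev - current:.2f})"
    print(f"{frac:>4.2f}x data -> loss {current:.2f}{gain}")
    prev = current
```

Under these assumed constants, each 5x increase in data buys a 5x smaller improvement: the jump from 4% to 20% of the data cuts loss by 16.0, while the jump from 1x to 5x cuts it by only 0.64, which is the plateau pattern the reporting describes.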
What's more, those proposed fixes merely push the problem down the road: all they do is make AI slightly more efficient with its energy and data, and they don't answer the question of where these corporations will obtain fresh, high-quality data in the future.

So why does this matter? Big tech has poured billions of dollars into AI on the premise that it would become exponentially better and massively lucrative. Sadly, we now know that just won't happen. Take OpenAI. A few months ago, it was forecast to post a $5 billion annual loss and even faced bankruptcy. Worse, the AIs it has released aren't profitable despite hundreds of millions of users, so even if OpenAI didn't spend a dime on developing new models, it would still be losing money. Yet, despite this, it managed to secure several billion dollars in credit and an additional $6.6 billion in fresh capital, valuing the company at a stunning $157 billion and saving it from collapse. However, given its present pace of losses and development expenditures, this is just enough to stave off insolvency for about another year.

This, paired with the now-confirmed sharply diminishing returns, means that some of the largest industries and biggest investment firms in the world are backing a fundamentally flawed product. The last time our economy did this, it produced one of the biggest financial crises in living memory: the credit crisis of 2008.
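As a sanity check on that "about another year" claim, here is a back-of-the-envelope runway calculation. The $5 billion annual loss and $6.6 billion raise are the figures cited above; the size of the credit line ("several billion") and the burn growth rate are assumptions I have filled in for illustration.

```python
# Back-of-the-envelope runway estimate from the figures cited in the piece.
equity_raise = 6.6e9   # fresh capital cited above
credit_line  = 4.0e9   # ASSUMPTION: "several billion dollars in credit"
annual_burn  = 5.0e9   # forecast yearly deficit cited above
burn_growth  = 0.5     # ASSUMPTION: burn grows 50%/year as training costs rise

capital = equity_raise + credit_line
years = 0.0
while capital > 0:
    spend = min(capital, annual_burn)  # spend at most one year's burn
    years += spend / annual_burn       # credit the fraction of a year covered
    capital -= spend
    annual_burn *= 1 + burn_growth     # development costs keep climbing

print(f"Estimated runway: ~{years:.1f} years")
```

Under these assumptions the money lasts roughly 1.7 years, and any faster growth in training spend pulls that closer to the single year suggested above.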
By Iron-Pen☑️ about a year ago in 01