Saturday, May 8, 2021

Financial Thinking (2): Is Bitcoin Really the New Gold?

Bitcoin is not the new gold, and it will not become the new gold.

First, Bitcoin is a virtual asset; in essence it is little more than a computer game, whereas gold is a physical asset that can be traded in physical form.

Second, no matter how much something is worth on the market, its price has to be supported by practical uses. Bitcoin has no practical use at all, while gold does: of all the gold in the world today, roughly 50% goes into jewelry and about 10% into high-tech products, including the computer chips used to mine Bitcoin. On top of that, people have trusted gold for thousands of years, and that psychological factor matters a great deal.

Finally, gold has an enormous market and can be traded on exchanges around the world. In London alone, roughly 18 billion US dollars' worth of physical gold changes hands every day, which makes gold one of the most liquid assets. Liquidity is simply the ability to convert an asset into cash; gold's liquidity is excellent, meaning gold is easy to turn into money. Bitcoin's market, although expanding rapidly, is still nowhere near that scale. Moreover, Bitcoin's total supply is capped at 21 million coins, so even if Bitcoin has not died out by 2140, its market will struggle to match gold's. And what does a small market mean for Bitcoin? A single wave of selling can trigger a crash.

Bitcoin is not money, and the chance that it will ever become a true currency is close to zero.

Why did the Ming dynasty fall? Its collapse had much to do with money. By the middle and late Ming, the economy of today's Jiangsu-Zhejiang region was already quite prosperous, and handicraft industries such as silk were highly developed. The Ming used silver as its currency, but China has never been a major silver producer, so the dynasty's silver came mainly from Europe and Japan. When Europe and Japan later cut off their silver supply, the workshops of Jiangnan could not find the funds they needed for production and shut down one after another; the Ming economy collapsed, and in the end the dynasty fell.

What does this have to do with Bitcoin? The point is that adopting silver as its currency was a grave mistake for the Ming, and if countries today adopted Bitcoin as their currency, they would repeat the Ming's mistake.

—— Li Guoping (李国平), Financial Thinking (《金融思维》): Goods and Money Must Be in Balance (货与币必须平衡). CITIC Press (中信出版社), 2020

Friday, May 7, 2021

Financial Thinking (1): Is Blockchain a Bubble or a New Opportunity?

Blockchain is a technology, and like internet technology in the 1990s it is a foundational, or underlying, technology. A foundational technology is like a building's foundation: once the foundation is laid properly, you can build anything on top of it. As a foundational technology, blockchain is sound; there is no real question about that. Bitcoin, Ether, and the rest are merely applications of blockchain, so Bitcoin is not the same thing as blockchain. If Bitcoin runs into trouble one day, that is a problem with one application of blockchain, not with blockchain itself. It is much like Alibaba, Baidu, and Tencent all building on the internet as an underlying technology: none of these three companies is internet technology itself, and if one of them fails, that is the company's own problem, not a failure of the internet.

Second, blockchain spans many layers and applications: some projects work on the base technology, some on data processing, some on smart contracts, and so on. For now, though, blockchain seems to be used almost exclusively for digital currencies such as Bitcoin and Ether, and its other applications have yet to be properly developed.

Third, blockchain today is where the internet was around 1995. The internet was then still new and was being hyped intensely, which led to the dot-com bubble of 1999–2000. During the bubble, a company's share price would soar the moment it attached "dot-com" to its name; AOL (America Online) is a classic example. By 2001 the bubble had burst and many internet companies had gone under. After the shakeout, the truly valuable internet companies, such as Amazon, were left standing.

In business, pioneers often end up as martyrs: the first companies founded in a field frequently fail, leaving lessons for those that come after. MySpace, for example, was both the pioneer and the martyr of online social networking, while Facebook built on MySpace's experience to become a dominant social network company. Blockchain has not yet reached its takeoff stage, and the blockchain companies and projects being founded today may well end up as those martyrs.

—— Li Guoping (李国平), Financial Thinking (《金融思维》): Goods and Money Must Be in Balance (货与币必须平衡). CITIC Press (中信出版社), 2020

Thursday, April 29, 2021

Don’t Click LIKE

 To click “Like,” within the precise definitions of information theory, is literally the least informative type of nontrivial communication, providing only a minimal one bit of information about the state of the sender (the person clicking the icon on a post) to the receiver (the person who published the post).
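A side note of my own, not part of Newport's text: the "one bit" figure follows directly from Shannon's measure of information. A Like is a binary signal, and if we assume (purely for illustration) that the two outcomes are equally likely, the signal carries exactly log2(2) = 1 bit, the minimum any nontrivial message can convey. A minimal sketch in Python:

import math

# Shannon information content of one binary "Like / no-Like" signal,
# assuming both outcomes are equally likely (an illustrative assumption).
p_like = 0.5
bits_conveyed = -math.log2(p_like)
print(bits_conveyed)  # 1.0 -- one bit, the least a nontrivial signal can carry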

 

Instead of seeing these easy clicks as a fun way to nudge a friend, start treating them as poison to your attempts to cultivate a meaningful social life. Put simply, you should stop using them. Don’t click “Like.” Ever. And while you’re at it, stop leaving comments on social media posts as well. No “so cute!” or “so cool!” Remain silent.

 

The reason I’m suggesting such a hard stance against these seemingly innocuous interactions is that they teach your mind that connection is a reasonable alternative to conversation. The motivating premise behind my conversation-centric communication philosophy is that once you accept this equality, despite your good intentions, the role of low-value interactions will inevitably expand until it begins to push out the high-value socializing that actually matters. If you eliminate these trivial interactions cold turkey, you send your mind a clear message: conversation is what counts—don’t be distracted from this reality by the shiny stuff on your screen. As I mentioned before, you may think you can balance both types of interaction, but most people can’t.

 

I want you to replace this with a state where your leisure time is now filled with better pursuits, many of which will exist primarily in the physical world. In this new state, digital technology is still present, but now subordinated to a support role: helping you to set up or maintain your leisure activities, but not acting as the primary source of leisure itself. 


Digital Minimalism: Choosing a Focused Life in a Noisy World, by Cal Newport

Wednesday, April 28, 2021

Reclaiming Conversation

To be clear, conversation-centric communication requires sacrifices. If you adopt this philosophy, you’ll almost certainly reduce the number of people with whom you have an active relationship. Real conversation takes time, and the total number of people for whom you can uphold this standard will be significantly less than the total number of people you can follow, retweet, “like,” and occasionally leave a comment for on social media, or ping with the occasional text. Once you no longer count the latter activities as meaningful interaction, your social circle will seem at first to contract.

This sense of contraction, however, is illusory. Conversation is the good stuff; it’s what we crave as humans and what provides us with the sense of community and belonging necessary to thrive. Connection, on the other hand, though appealing in the moment, provides very little of what we need.

Digital Minimalism: Choosing a Focused Life in a Noisy World, by Cal Newport

Tuesday, April 27, 2021

iGen

Young people born between 1995 and 2012, a group Twenge calls “iGen,” exhibited remarkable differences as compared to the Millennials that preceded them. One of the biggest and most troubling changes was iGen’s psychological health. “Rates of teen depression and suicide have skyrocketed,” Twenge writes, with much of this seemingly due to a massive increase in anxiety disorders. “It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades.”

What instigated these changes? Twenge agrees with the intuition of the university mental health administrator when she notes that these shifts in mental health correspond “exactly” to the moment when American smartphone ownership became ubiquitous. The defining trait of iGen is that they grew up with iPhones and social media, and don’t remember a time before constant access to the internet. They’re paying a price for this distinction with their mental health. “Much of this deterioration can be traced to their phones,” Twenge concludes.

 

As part of his reporting, Denizet-Lewis interviewed Jean Twenge, who made it clear that she didn’t set out to implicate the smartphone: “It seemed like too easy an explanation for negative mental-health outcomes in teens,” but it ended up the only explanation that fit the timing. Lots of potential culprits, from stressful current events to increased academic pressure, existed before the spike in anxiety that begins around 2011. The only factor that dramatically increased right around the same time as teenage anxiety was the number of young people owning their own smartphones.

 

Returning to our canary-in-the-coal-mine analogy, the plight of iGen provides a strong warning about the danger of solitude deprivation. When an entire cohort unintentionally eliminated time alone with their thoughts from their lives, their mental health suffered dramatically. On reflection, this makes sense. These teenagers have lost the ability to process and make sense of their emotions, or to reflect on who they are and what really matters, or to build strong relationships, or even to just allow their brains time to power down their critical social circuits, which are not meant to be used constantly, and to redirect that energy to other important cognitive housekeeping tasks. We shouldn’t be surprised that these absences lead to malfunctions.

 

But once you begin studying the positive benefits of time alone with your thoughts, and encounter the distressing effects that appear in populations that eliminate this altogether, a simpler explanation emerges: we need solitude to thrive as human beings, and in recent years, without even realizing it, we’ve been systematically reducing this crucial ingredient from our lives.

 

Simply put, humans are not wired to be constantly wired.


Digital Minimalism: Choosing a Focused Life in a Noisy World, by Cal Newport



Friday, April 23, 2021

Solitude Deprivation

A state in which you spend close to zero time alone with your own thoughts and free from input from other minds.
 
The concern that modernity is at odds with solitude is not new. Writing in the 1980s, Anthony Storr complained that “contemporary Western culture makes the peace of solitude difficult to attain.” He pointed to Muzak and the recent invention of the “car telephone” as the latest evidence of this encroachment of noise into all parts of our lives. Over a hundred years earlier, Thoreau demonstrated similar concern, famously writing in Walden that “we are in great haste to construct a magnetic telegraph from Maine to Texas; but Maine and Texas, it may be, have nothing important to communicate.” The question before us, then, is whether our current moment offers a new threat to solitude that is somehow more pressing than those that commentators have bemoaned for decades. I argue that the answer is a definitive yes.
 
To understand my concern, the right place to start is the iPod revolution that occurred in the first years of the twenty-first century. We had portable music before the iPod, most commonly in the form of the Sony Walkman and Discman (and their competitors), but these devices played only a restricted role in most people’s lives—something you used to entertain yourself while exercising, or in the back seat of a car on a long family road trip. If you stood on a busy city street corner in the early 1990s, you would not see too many people sporting black foam Sony earphones on their way to work.
 
By the early 2000s, however, if you stood on that same street corner, white earbuds would be near ubiquitous. The iPod succeeded not just by selling lots of units, but also by changing the culture surrounding portable music. It became common, especially among younger generations, to allow your iPod to provide a musical backdrop to your entire day—putting the earbuds in as you walk out the door and taking them off only when you couldn’t avoid having to talk to another human.
 
To put this in context, previous technologies that threatened solitude, from Thoreau’s telegraph to Storr’s car phone, introduced new ways to occasionally interrupt time alone with your thoughts, whereas the iPod provided for the first time the ability to be continuously distracted from your own mind. The farmer in Thoreau’s time might leave the quiet fireside to walk to town and check the evening telegraph dispatches, fragmenting a moment of solitude, but there was no way that this technology could offer continuous distraction to this same farmer as he went about his day. The iPod was pushing us toward a newly alienated phase in our relationship with our own minds.
 
This transformation started by the iPod, however, didn’t reach its full potential until the release of its successor, the iPhone, or, more generally, the spread of modern internet-connected smartphones in the second decade of the twenty-first century. Even though iPods became ubiquitous, there were still moments in which it was either too much trouble to slip in the earbuds (think: waiting to be called into a meeting), or it might be socially awkward to do so (think: sitting bored during a slow hymn at a church service). The smartphone provided a new technique to banish these remaining slivers of solitude: the quick glance. At the slightest hint of boredom, you can now surreptitiously glance at any number of apps or mobile-adapted websites that have been optimized to provide you an immediate and satisfying dose of input from other minds.
 
It’s now possible to completely banish solitude from your life. Thoreau and Storr worried about people enjoying less solitude. We must now wonder if people might forget this state of being altogether.

Digital Minimalism: Choosing a Focused Life in a Noisy World, by Cal Newport

Thursday, April 22, 2021

Solitude

“All of humanity’s problems stem from man’s inability to sit quietly in a room alone,” Blaise Pascal famously wrote in the late seventeenth century. Half a century later, and an ocean away, Benjamin Franklin took up the subject in his journal: “I have read abundance of fine things on the subject of solitude. . . . I acknowledge solitude an agreeable refreshment to a busy mind.”
 
The academy was late to recognize the importance of time alone with your own thoughts. In 1988, the noted English psychiatrist Anthony Storr helped correct this omission with his seminal book, Solitude: A Return to the Self. As Storr noted, by the 1980s, psychoanalysis had become obsessed with the importance of intimate personal relationships, identifying them as the most important source of human happiness. But Storr’s study of history didn’t seem to support this hypothesis. He opens his 1988 book with the following quote from Edward Gibbon: “Conversation enriches the understanding, but solitude is the school of genius.” He then boldly writes: “Gibbon is surely right.”
 
Edward Gibbon lived a solitary life, but not only did he produce wildly influential work, he also seemed perfectly happy. Storr notes that the need to spend a great deal of time alone was common among “the majority of poets, novelists, and composers.” He lists Descartes, Newton, Locke, Pascal, Spinoza, Kant, Leibniz, Schopenhauer, Nietzsche, Kierkegaard, and Wittgenstein as examples of men who never had families or fostered close personal ties, yet still managed to lead remarkable lives. Storr’s conclusion is that we’re wrong to consider intimate interaction as the sine qua non of human thriving. Solitude can be just as important for both happiness and productivity.
 
Virginia Woolf would agree with Storr that solitude is a prerequisite for original and creative thought, but she would then add that women had been systematically denied both the literal and figurative room of their own in which to cultivate this state. To Woolf, in other words, solitude is not a pleasant diversion, but instead a form of liberation from the cognitive oppression that results in its absence.
 
Michael Harris argues, perhaps counterintuitively, that “the ability to be alone . . . is anything but a rejection of close bonds,” and can instead affirm them. Calmly experiencing separation, he argues, builds your appreciation for interpersonal connections when they do occur.
 
Wendell Berry summarized this point more succinctly when he wrote: “We enter solitude, in which also we lose loneliness.”

Digital Minimalism: Choosing a Focused Life in a Noisy World, by Cal Newport